A novel hybrid deep learning architecture for predicting acute kidney injury using patient record data and ultrasound kidney images.

  • Additional Information
    • Abstract:
      Acute kidney injury (AKI) is a sudden onset of kidney damage. Currently, no hybrid model for predicting AKI takes advantage of both types of data. In this research, a novel hybrid deep learning architecture for AKI prediction was created using de-identified numeric patient data and ultrasound kidney images. Using features including serum creatinine, two numeric models were developed with MIMIC-III and paired hospital data, and an image-only model was developed from the hospital ultrasounds. Convolutional neural networks (CNNs), including VGG and ResNet, were used, and the numeric and image models were combined into a hybrid model by concatenating their feature maps to form a new input. The hybrid model successfully predicted AKI, reaching a highest AUROC of 0.953, the first time an AKI machine learning model has surpassed an AUROC of 0.9. The model also achieved an accuracy of 90% and an F1-score of 0.91. This model can be deployed in urgent clinical settings such as the ICU, helping doctors assess the risk of AKI shortly after a patient's admission. The approach also has strong potential to be applied to other medical predictive applications. [ABSTRACT FROM AUTHOR] (A minimal sketch of the feature-map fusion is given below, after this record.)
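
The abstract describes fusing a CNN image branch with a numeric patient-record branch by concatenating their feature maps before a shared classifier. The sketch below, written in PyTorch, illustrates that general pattern only: the framework, ResNet-18 backbone, layer sizes, and the count of 20 numeric features are assumptions for illustration, not details taken from the paper (which mentions VGG and ResNet but does not specify the exact configuration here).

    import torch
    import torch.nn as nn
    from torchvision import models

    class HybridAKIModel(nn.Module):
        """Hypothetical two-branch network: an image CNN branch and a
        numeric-feature branch whose outputs are concatenated before a
        shared classification head. Sizes are illustrative, not from the paper."""

        def __init__(self, num_numeric_features: int = 20):
            super().__init__()
            # Image branch: a ResNet-18 backbone with its final classifier removed,
            # leaving a 512-dimensional feature vector per image.
            backbone = models.resnet18(weights=None)
            self.image_branch = nn.Sequential(*list(backbone.children())[:-1])

            # Numeric branch: a small MLP over patient-record features
            # (e.g. serum creatinine and other labs/vitals).
            self.numeric_branch = nn.Sequential(
                nn.Linear(num_numeric_features, 64),
                nn.ReLU(),
                nn.Linear(64, 32),
                nn.ReLU(),
            )

            # Fusion head: concatenated features -> AKI probability.
            self.head = nn.Sequential(
                nn.Linear(512 + 32, 64),
                nn.ReLU(),
                nn.Linear(64, 1),
            )

        def forward(self, image, numeric):
            img_feat = self.image_branch(image).flatten(1)   # (B, 512)
            num_feat = self.numeric_branch(numeric)          # (B, 32)
            fused = torch.cat([img_feat, num_feat], dim=1)   # feature-map concatenation
            return torch.sigmoid(self.head(fused))           # P(AKI)

    # Example forward pass with dummy inputs.
    model = HybridAKIModel(num_numeric_features=20)
    dummy_image = torch.randn(4, 3, 224, 224)   # batch of ultrasound images
    dummy_numeric = torch.randn(4, 20)          # batch of numeric record features
    probs = model(dummy_image, dummy_numeric)   # shape (4, 1)

The key design choice mirrored here is late fusion: each modality is encoded separately, and only the resulting feature vectors are concatenated and passed to a joint head, so either branch can be pretrained or swapped (e.g. VGG instead of ResNet) without changing the other.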