    Please use this identifier to cite or link to this item: https://ir.csmu.edu.tw:8080/ir/handle/310902500/22362


    Title: Deep Learning for Automated Diabetic Retinopathy Screening Fused With Heterogeneous Data From EHRs Can Lead to Earlier Referral Decisions
    Authors: Hsu, Min-Yen;Chiou, Jeng-Yuan;Liu, Jung-Tzu;Lee, Chee-Ming;Lee, Ya-Wen;Chou, Chien-Chih;Lo, Shih-Chang;Kornelius, Edy;Yang, Yi-Sun;Chang, Sung-Yen;Liu, Yu-Cheng;Huang, Chien-Ning;Tseng, Vincent S
    Date: 2021-08-02
    Issue Date: 2022-05-30T03:49:40Z (UTC)
    Abstract: Purpose: Fundus images are typically used as the sole training input for automated diabetic retinopathy (DR) classification. In this study, we considered several well-known DR risk factors and attempted to improve the accuracy of DR screening.

    Methods: Fusing nonimage data (e.g., age, gender, smoking status, International Classification of Diseases codes, and laboratory tests) with data from fundus images enables an end-to-end deep learning architecture for DR screening. We propose a neural network that trains on heterogeneous data simultaneously and increases the performance of DR classification in terms of sensitivity and specificity. In this retrospective study, 13,410 fundus images and their corresponding nonimage data were collected from Chung Shan Medical University Hospital in Taiwan. The images were classified as either nonreferable or referable for DR by a panel of ophthalmologists. Cross-validation was used to train the models and to evaluate the classification performance.
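
    The published architecture is not reproduced here; the following is a minimal sketch, assuming a PyTorch implementation, of the kind of two-branch fusion network the abstract describes: a CNN branch for the fundus image, a small MLP branch for the nonimage EHR features (age, gender, smoking status, ICD codes, laboratory values), and a shared head for the binary referable/nonreferable decision. The ResNet-50 backbone, embedding sizes, and feature count are illustrative assumptions, not the authors' settings.

# Minimal sketch of image + tabular (EHR) fusion for referable-DR classification.
# Backbone choice, layer sizes, and the 16-feature tabular input are assumptions.
import torch
import torch.nn as nn
import torchvision.models as models

class FusionDRNet(nn.Module):
    def __init__(self, num_tabular_features: int = 16):
        super().__init__()
        # Image branch: ImageNet-pretrained ResNet-50 (assumed backbone),
        # with the final fully connected layer removed to expose a 2048-d embedding.
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        backbone.fc = nn.Identity()
        self.image_branch = backbone
        # Nonimage branch: small MLP over numerically encoded EHR features.
        self.tabular_branch = nn.Sequential(
            nn.Linear(num_tabular_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fusion head: concatenate both embeddings, output one logit for referable DR.
        self.classifier = nn.Sequential(
            nn.Linear(2048 + 64, 256), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(256, 1),
        )

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)          # (B, 2048)
        tab_feat = self.tabular_branch(tabular)      # (B, 64)
        fused = torch.cat([img_feat, tab_feat], dim=1)
        return self.classifier(fused).squeeze(1)     # (B,) logits

# Example forward pass with dummy inputs:
model = FusionDRNet(num_tabular_features=16)
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 16))
probs = torch.sigmoid(logits)                        # referable-DR probabilities

    Feature-level fusion of this kind allows each branch to be pretrained or regularized independently before the joint classifier is trained end to end.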

    Results: The proposed fusion model achieved 97.96% area under the curve with 96.84% sensitivity and 89.44% specificity for determining referable DR from multimodal data, and significantly outperformed the models that used image or nonimage information separately.
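
    For reference, the reported evaluation metrics (AUC, sensitivity, specificity) can be computed from model outputs as shown below; this is a minimal sketch using scikit-learn with placeholder predictions, not the study's data or evaluation code.

# Compute AUC, sensitivity, and specificity for a binary referable-DR classifier.
# The arrays below are placeholder values for illustration only.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])                    # 1 = referable DR
y_prob = np.array([0.1, 0.4, 0.9, 0.8, 0.7, 0.2, 0.6, 0.3])    # predicted probabilities
y_pred = (y_prob >= 0.5).astype(int)                            # assumed 0.5 threshold

auc = roc_auc_score(y_true, y_prob)                  # area under the ROC curve
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                         # true positive rate
specificity = tn / (tn + fp)                         # true negative rate
print(f"AUC={auc:.3f}, sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")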

    Conclusions: The fusion model with heterogeneous data has the potential to improve referable DR screening performance for earlier referral decisions.

    Translational relevance: Artificial intelligence fused with heterogeneous data from electronic health records could provide earlier referral decisions from DR screening.
    URI: https://ir.csmu.edu.tw:8080/handle/310902500/22362
    Relation: Transl Vis Sci Technol, 10(9), 18
    Appears in Collections: [Department of Medicine] Journal Articles

    Files in This Item:

    File: i2164-2591-10-9-18_1629788375.00135.pdf  |  Size: 1066Kb  |  Format: Adobe PDF



    All items in CSMUIR are protected by copyright, with all rights reserved.