Beware explanations from AI in health care
Published in Science: 16 July 2021
RE: Beware explanations from AI in health care
To the Editor:

Babic et al. (1) recognize the emerging consensus favoring explainable artificial intelligence (AI) over opaque (black-box) AI. We agree that requiring transparency in AI is counterproductive when the system can offer only post hoc rationalizations of black-box diagnoses. We offer an alternative path for AI that both improves transparency and enables gains in system performance.

An emerging image-processing technique fuses visible structure information with black-box AI parameters in the classifier network, improving diagnostic accuracy across a range of diagnostic classes (2-5).

If conventional image-processing techniques can recognize a structure, and the diagnostic system allows the fusion of AI information with that structure, the output can achieve higher diagnostic accuracy and explain the correct diagnosis. The fusion concept extends to any additional structural data or clinical metadata that the black-box AI system lacks, such as the dimple sign for a benign dermatofibroma. Fusing the AI information with the clinical sign again improves both diagnostic accuracy and transparency. Such clinical signs are especially valuable for a so-called long-tailed diagnosis such as dermatofibroma, for which training examples are scarce (6).

The optimal role for AI in the diagnostic process is to serve as a diagnostic adjunct: to assist, not replace, the clinician by providing supportive information for diagnostic assessment. The clinician, not the AI software, is responsible for the care of the patient.

The fusion of structure identification with AI offers a significant advantage in diagnostic performance for atypical or uncommon diagnostic subtypes that both clinicians and AI systems have difficulty identifying, such as amelanotic melanoma (7, 8). Fusing vessel-structure parameters with AI techniques, for example, improves diagnostic performance for melanoma (2).

Ultimately, bridging AI and the clinician requires identifying all critical information, including recognizable structures and metadata, thereby improving final diagnostic accuracy.

REFERENCES AND NOTES
1. B. Babic et al., Science 373, 284 (2021).
2. J. Hagerty et al., IEEE J. Biomed. Health Inform. 23, 1385 (2019).
3. S. Li et al., Phys. Med. Biol. 64, 175012 (2019).
4. N. Antropova et al., Med. Phys. 44, 162 (2017).
5. P. Guo et al., IEEE J. Biomed. Health Inform. 20, 1595 (2016).
6. International Skin Imaging Collaboration; https://www.isic-archive.com/#!/topWithHeader/wideContentTop/main.
7. W. V. Stoecker, W. Stolz, Arch. Dermatol. 144, 1207 (2008).
8. S. W. Menzies et al., Arch. Dermatol. 144, 1120 (2008).
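The fusion approach the letter describes (refs. 2-5) can be illustrated with a minimal sketch: handcrafted structure features from classical image processing are concatenated with the black-box network's feature vector, and a single classifier is trained on the fused representation. Everything below is illustrative and hypothetical, not the authors' implementation: the synthetic data, the single `vessel_score` structure feature, and the stand-in CNN embedding are all assumptions made to keep the sketch self-contained.

```python
import numpy as np

# Hypothetical sketch of feature-level fusion: concatenate a handcrafted
# structure feature (e.g., a vessel metric from classical image processing)
# with black-box CNN features, then fit one linear classifier on the
# fused vector. Data and feature names are synthetic placeholders.

rng = np.random.default_rng(0)

n = 200
cnn_features = rng.normal(size=(n, 8))   # stand-in for a CNN embedding
vessel_score = rng.normal(size=(n, 1))   # stand-in structure feature

# Synthetic labels that depend on both sources, so fusion is informative.
labels = (vessel_score[:, 0] + cnn_features[:, 0] > 0).astype(float)

# Late fusion by concatenation: the classifier sees both feature sources.
fused = np.hstack([cnn_features, vessel_score])

# Plain logistic regression fit by gradient descent on the fused features.
w = np.zeros(fused.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-fused @ w))
    w -= 0.1 * fused.T @ (p - labels) / n

pred = (1.0 / (1.0 + np.exp(-fused @ w)) > 0.5).astype(float)
accuracy = (pred == labels).mean()
```

Because the structure feature enters the classifier as an explicit input, its learned weight indicates how much a recognizable structure (here, the hypothetical vessel score) contributed to the decision, which is the transparency gain the letter argues for.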