
Evaluating Local Explainable AI Techniques for the Classification of Chest X-Ray Images

Claudio Estatico, Damiano Verda, Enrico Ferrari
2024-01-01

Abstract

The widespread adoption of convolutional neural networks and transformers for image classification has made the demand for transparency of deep models more urgent, especially when decisions involve sensitive issues such as human health. In this paper we present metrics to evaluate the performance of visual explanations of AI classifiers against the annotations of expert practitioners. These metrics are used to assess whether state-of-the-art deep models for X-ray image classification are capturing the right information, and to identify the most effective methods for extracting explanations from black-box models. Insights from this analysis, carried out on the most comprehensive dataset of thoracic diseases (ChestX-ray14), could help practitioners understand the potential and limitations of deep models in computer-aided diagnosis.
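The abstract describes comparing visual explanations (e.g. saliency maps) with expert annotations, but does not specify the metrics used. As a minimal sketch of one plausible such metric, the snippet below computes intersection-over-union between a thresholded saliency map and a binary annotation mask; the function name `saliency_iou` and the `threshold` parameter are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def saliency_iou(saliency, annotation_mask, threshold=0.5):
    """Intersection-over-union between a thresholded saliency map and an
    expert annotation mask (both 2-D arrays of the same shape).
    Hypothetical example metric, not taken from the paper."""
    binary = saliency >= threshold          # binarize the explanation
    mask = annotation_mask.astype(bool)     # expert-marked region
    intersection = np.logical_and(binary, mask).sum()
    union = np.logical_or(binary, mask).sum()
    return intersection / union if union > 0 else 0.0

# Toy example: a 4x4 saliency map vs. a 2x2 bounding-box annotation.
saliency = np.array([[0.9, 0.8, 0.1, 0.0],
                     [0.7, 0.6, 0.2, 0.0],
                     [0.1, 0.1, 0.0, 0.0],
                     [0.0, 0.0, 0.0, 0.0]])
annotation = np.zeros((4, 4), dtype=int)
annotation[0:2, 0:2] = 1  # expert-marked region
print(saliency_iou(saliency, annotation))  # → 1.0 (perfect overlap)
```

A score near 1 would indicate that the model's explanation focuses on the same region a radiologist marked, while a score near 0 would suggest the classifier relies on clinically irrelevant areas.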
ISBN: 9783031638022, 9783031638039
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11567/1289637
Warning: the data displayed have not been validated by the university.

Citations
  • PubMed Central: n/a
  • Scopus: 0
  • Web of Science: 0