Generalization of chest x-ray disease prediction by learning feature representations agnostic to clinical domains
Chest radiography is a common and essential examination in medical practice for the diagnosis of lung diseases. Recent work in artificial intelligence has demonstrated that transfer learning of deep learning models can reach performance at the level of practicing radiologists. These techniques transfer the features learned on ImageNet to medical data by fine-tuning pre-trained deep convolutional networks, and they have relied on public chest x-ray datasets whose train and test inputs are drawn from the same distribution.

Unfortunately, the performance of deep learning models degrades significantly on test data coming from a different domain. In clinical applications, there is a significant shift between the train (source) and test (target) domains, and the target data are not always available during training. Building a diagnostic model that transfers to different clinical domains and populations remains an open challenge.

This project focuses on exploring novel transfer learning techniques, specifically domain generalization, and on developing a novel model capable of learning feature representations agnostic to domain shift. The originality of this work will allow the intern to submit a research paper at the end of the program. The company will then exploit the developed algorithms to validate the diagnostic model at two clinical sites.
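One well-known family of methods for learning domain-agnostic features is domain-adversarial training (e.g. DANN-style gradient reversal), in which a shared feature extractor is trained to support disease classification while an adversarial domain classifier's gradient is reversed so the features become uninformative about the source domain. The sketch below is purely illustrative, not the project's actual method: it uses a tiny synthetic two-domain dataset, linear layers in place of a convolutional network, and a hand-derived backward pass, with all names (`W_f`, `w_y`, `w_d`, `lam`) being assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny synthetic stand-in for two clinical sites: 8 samples, 4 features.
X = rng.normal(size=(8, 4))
y = rng.integers(0, 2, size=8)        # disease label (binary, for simplicity)
d = np.array([0] * 4 + [1] * 4)       # domain label: which "site" a sample came from

W_f = rng.normal(scale=0.1, size=(4, 3))  # shared feature extractor (linear here)
w_y = rng.normal(scale=0.1, size=3)       # disease-classification head
w_d = rng.normal(scale=0.1, size=3)       # adversarial domain-classification head

lam, lr = 1.0, 0.1  # gradient-reversal strength and learning rate (assumed values)

# Forward pass.
F = X @ W_f                # shared features
p_y = sigmoid(F @ w_y)     # disease prediction
p_d = sigmoid(F @ w_d)     # domain prediction

# Binary cross-entropy gradients at the two heads' logits.
g_y = (p_y - y) / len(y)
g_d = (p_d - d) / len(d)
grad_wy = F.T @ g_y        # label head minimizes its loss
grad_wd = F.T @ g_d        # domain head also minimizes its loss

# Gradient reversal: the feature extractor receives the label-loss gradient
# MINUS lambda times the domain-loss gradient, so it learns features that
# help diagnosis but hinder domain discrimination.
grad_F = np.outer(g_y, w_y) - lam * np.outer(g_d, w_d)
grad_Wf = X.T @ grad_F

# One gradient step.
W_f -= lr * grad_Wf
w_y -= lr * grad_wy
w_d -= lr * grad_wd
```

In a real pipeline the linear extractor would be replaced by an ImageNet-pretrained convolutional backbone and the sign flip implemented as a gradient-reversal layer inside the autodiff graph, but the opposing signs on the two loss gradients above are the core of the idea.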