Deep learning shows promise for reading chest x-rays

Monday, November 28 | 12:15 p.m.-12:45 p.m. | IN211-SD-MOA2 | Lakeside, IN Community, Station 2
A deep-learning method could be used to provide a more "human-like" diagnosis of chest x-rays, according to a group from the U.S. National Institutes of Health (NIH) Clinical Center.

Research and applications for computer-aided detection (CAD) have so far been limited to detecting the specific diseases the software was targeted at and trained for, noted senior author Dr. Ronald Summers, PhD.

"Nonetheless, there exists a vast amount of electronic health records and images in PACS from which a more 'general' framework for training an advanced CAD [software] can be developed," Summers told AuntMinnie.com.

Following up on previous NIH work on detecting the presence or absence of a number of common diseases on patient scans, the team sought to explore the possibility of automatically generating a more comprehensive, human-like diagnosis by learning from a database of patient scans and radiology reports. They used chest x-rays and report summaries annotated with standardized Medical Subject Headings (MeSH) terms to train deep-learning systems composed of convolutional neural networks and recurrent neural networks.
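The pipeline described here follows the general image-captioning pattern: a convolutional network encodes the x-ray into a feature vector, and a recurrent network decodes that vector into a sequence of MeSH-style terms. Below is a minimal PyTorch sketch of that CNN-plus-RNN architecture; it is not the NIH team's code, and the `ChestXrayCaptioner` class, layer sizes, and toy vocabulary are illustrative assumptions.

```python
# A minimal sketch (not the NIH team's code) of the CNN-plus-RNN pattern
# described above: a convolutional encoder summarizes the x-ray image,
# and a recurrent decoder emits a sequence of MeSH-style terms.
import torch
import torch.nn as nn

class ChestXrayCaptioner(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        # CNN encoder: maps a 1-channel x-ray to a fixed-length feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (batch, 32, 1, 1)
            nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        # RNN decoder: generates one term (token) per step.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        # Prepend the image feature as the first "token" of the sequence,
        # then teacher-force the annotation tokens through the LSTM.
        feats = self.encoder(images).unsqueeze(1)   # (B, 1, E)
        tokens = self.embed(captions)               # (B, T, E)
        seq = torch.cat([feats, tokens], dim=1)     # (B, T+1, E)
        hidden, _ = self.rnn(seq)
        return self.out(hidden)                     # (B, T+1, V)

# Toy usage with random data standing in for x-rays and MeSH annotations.
model = ChestXrayCaptioner(vocab_size=50)
images = torch.randn(4, 1, 64, 64)       # batch of fake 64 x 64 x-rays
captions = torch.randint(0, 50, (4, 6))  # fake six-term MeSH sequences
logits = model(images, captions)
print(logits.shape)  # torch.Size([4, 7, 50])
```

In this setup, conditioning the decoder on the image feature is what lets the same network describe context (size, location, severity) rather than output a single disease label.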

"The trained system can not only detect a disease from a chest x-ray, but also can describe its context, such as its size, location, affected organ, severity, etc.," Summers said. "While we demonstrate how we can get closer to a more 'human-like' and 'general' diagnosis system, more research needs to be done to improve its performance."

Visit the Lakeside Center to learn more about the project from the NIH team, which included presenter Hoo-Chang Shin, PhD; Kirk Roberts; and Le Lu, PhD.
