Google AI model beats humans in detecting breast cancer

San Francisco: In a ray of hope for women undergoing breast cancer screening, and for healthy women who receive false alarms from digital mammography, an Artificial Intelligence (AI) model from Google has outperformed radiologists at spotting breast cancer from X-ray images alone.

Reading mammograms is a difficult task, even for experts, and can often result in both false positives and false negatives.

In turn, these inaccuracies can lead to delays in detection and treatment, unnecessary stress for patients and a higher workload for radiologists who are already in short supply, Google said in a blog post.

Google's AI model spotted breast cancer in de-identified screening mammograms (where identifiable information has been removed) with greater accuracy, fewer false positives and fewer false negatives than experts.

"This sets the stage for future applications where the model could potentially support radiologists performing breast cancer screenings," said Shravya Shetty, Technical Lead, Google Health.

Digital mammography, or X-ray imaging of the breast, is the most common method of screening for breast cancer, with over 42 million exams performed each year in the US and the UK combined.

"But despite the wide usage of digital mammography, spotting and diagnosing breast cancer early remains a challenge," said Daniel Tse, Product Manager, Google Health.

Together with colleagues at DeepMind, Cancer Research UK Imperial Centre, Northwestern University and Royal Surrey County Hospital, Google set out to see if AI could support radiologists to spot the signs of breast cancer more accurately.

The findings, published in the journal Nature, showed that AI could improve the detection of breast cancer.

The Google AI model was trained and tuned on a representative data set comprising de-identified mammograms from more than 76,000 women in the UK and more than 15,000 women in the US, to see if it could learn to spot signs of breast cancer in the scans.

The model was then evaluated on a separate de-identified data set of more than 25,000 women in the UK and over 3,000 women in the US.

"In this evaluation, our system produced a 5.7 per cent reduction of false positives in the US, and a 1.2 per cent reduction in the UK. It produced a 9.4 per cent reduction in false negatives in the US, and a 2.7 per cent reduction in the UK," Google said.
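To make these figures concrete, a false positive is a healthy case flagged as cancer and a false negative is a missed cancer. The following is an illustrative sketch (not Google's actual evaluation code) of how false positive and false negative rates are computed from binary predictions and ground-truth labels:

```python
def error_rates(predictions, labels):
    """Return (false_positive_rate, false_negative_rate).

    predictions, labels: sequences of 0/1 values, where 1 means cancer.
    """
    fp = sum(1 for p, y in zip(predictions, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(predictions, labels) if p == 0 and y == 1)
    negatives = sum(1 for y in labels if y == 0)
    positives = sum(1 for y in labels if y == 1)
    fpr = fp / negatives if negatives else 0.0
    fnr = fn / positives if positives else 0.0
    return fpr, fnr

# Example: 6 screening cases, one false alarm and one missed cancer
labels      = [0, 0, 0, 0, 1, 1]
predictions = [0, 1, 0, 0, 1, 0]
fpr, fnr = error_rates(predictions, labels)
print(fpr)  # 0.25 (1 false alarm among 4 healthy cases)
print(fnr)  # 0.5  (1 missed cancer among 2 cancer cases)
```

The reductions Google reports compare the model's rates against those of human readers on the same cases.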

The researchers then trained the AI model only on data from women in the UK and evaluated it on the data set from women in the US.

In this separate experiment, there was a 3.5 per cent reduction in false positives and an 8.1 per cent reduction in false negatives, "showing the model's potential to generalize to new clinical settings while still performing at a higher level than experts".

Notably, when making its decisions, the model received less information than human experts did.

The human experts (in line with routine practice) had access to patient histories and prior mammograms, while the model only processed the most recent anonymized mammogram with no extra information.

Despite working from these X-ray images alone, the model surpassed individual experts in accurately identifying breast cancer.

This work, said Google, is the latest strand of its research looking into detection and diagnosis of breast cancer, not just within the scope of radiology, but also pathology.

"We're looking forward to working with our partners in the coming years to translate our machine learning research into tools that benefit clinicians and patients," said the tech giant.
