"Neutron" Can artificial intelligence outperform doctors in predicting medical complications? : Health care

"Neutron" Can artificial intelligence outperform doctors in predicting medical complications? : Health care

As part of efforts to improve health care, a team of specialists at an American university has developed a program that can predict medical complications in patients.

AI has already proven its ability to analyze medical imaging and pass exams designed for medical students. Now a new AI-based tool is out to prove that it can read doctors' reports and accurately predict the risk of death, readmission, and other potential complications.

The program was devised by a team from New York University's Grossman School of Medicine, part of NYU Langone Health, and is currently being tested in a number of the university's partner hospitals, with the aim of eventually bringing the technology into wider use across the medical community.

On Wednesday, a study describing the program and the benefits that could come from using it was published in the scientific journal Nature.

Study author Eric Oermann, a neurosurgeon and computer scientist at New York University, explained that predictive models not based on AI have existed for a long time, but their use in practice has been limited because preparing the data they need is a cumbersome process.

"The common thing in medical work everywhere is that doctors take notes of what they see and what they talk about with patients," he told AFP.

He explained that the researchers' main idea was to "see if it is possible to rely on medical observations as a source of data, and to build predictive models from them."

The prediction model, called NYUTron, was built from millions of medical notes in the files of 387,000 patients treated between January 2011 and May 2020 at hospitals associated with New York University.

These notes included doctors' written reports, notes on patient progress, reports on X-rays and other medical imaging, and recommendations given to patients upon discharge from the hospital, totaling 4.1 billion words.
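To make the general approach concrete, here is a minimal sketch of how free-text clinical notes can be turned into a predictive model by fine-tuning a pretrained language model on notes paired with an outcome label such as 30-day readmission. This is not the study's actual pipeline: the base model ("bert-base-uncased" as a stand-in for a clinical language model), the file name, and the column names are illustrative assumptions.

```python
# Sketch: fine-tune a pretrained language model to predict an outcome from note text.
# Assumes a hypothetical CSV with columns "note_text" and "readmitted_30d" (0/1 labels).
import pandas as pd
import torch
from torch.utils.data import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

class NoteDataset(Dataset):
    """Wraps tokenized clinical notes and their outcome labels."""
    def __init__(self, texts, labels, tokenizer, max_len=512):
        self.enc = tokenizer(list(texts), truncation=True,
                             padding="max_length", max_length=max_len)
        self.labels = list(labels)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

df = pd.read_csv("discharge_notes.csv")  # hypothetical table of notes and labels
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # stand-in model
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                           num_labels=2)

train_ds = NoteDataset(df["note_text"], df["readmitted_30d"], tokenizer)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=train_ds,
)
trainer.train()
```

The appeal of this kind of setup is that the notes doctors already write become the input, so no separate structured data entry is required.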

The program's biggest challenge was interpreting the language used by doctors, since each physician has a personal style and terminology that can differ greatly from one to the next, especially when it comes to abbreviations.

The researchers also tested the tool under real-world conditions, in particular by training it to analyze reports from a Manhattan hospital and then checking its results on a different set of patients at a Brooklyn hospital.

By comparing the predictions with what actually happened to the patients, the researchers were able to measure how often the program was correct.
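As a rough illustration of that evaluation step, the sketch below compares predicted risks with observed outcomes for patients at a second site. It is not the study's evaluation code: the arrays, the 0.5 threshold, and the choice of metrics are assumptions made for the example.

```python
# Sketch: score predictions from an external hospital against observed outcomes.
import numpy as np
from sklearn.metrics import roc_auc_score, recall_score

# y_true: 1 if the outcome (e.g., death or 30-day readmission) actually occurred.
# y_prob: the model's predicted probability for each patient (toy values here).
y_true = np.array([0, 1, 0, 0, 1, 1, 0, 1])
y_prob = np.array([0.1, 0.8, 0.3, 0.2, 0.9, 0.6, 0.4, 0.7])

auc = roc_auc_score(y_true, y_prob)  # ranking quality across all thresholds
sensitivity = recall_score(y_true, (y_prob >= 0.5).astype(int))  # share of true events flagged

print(f"External-site AUC: {auc:.2f}, sensitivity at 0.5: {sensitivity:.2f}")
```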

Not a substitute
The results were striking: before patients were discharged from the partner hospitals, NYUTron identified 95% of those who later died, and it correctly flagged 80% of those who were readmitted to hospital less than a month after discharge.

These predictions were more accurate than those of most physicians, and they also outperformed existing predictive models that are not based on AI.

The one surprise, however, was that a highly experienced physician, widely respected in the medical community, gave predictions "even better than those given by the program," Eric Oermann explained.

The program also predicted patients' length of hospital stay with 79% accuracy, cases in which insurers would decline to cover the cost of care with 87% accuracy, and cases in which patients went on to develop additional health problems with 89% accuracy.

Dr. Oermann stressed that AI will never replace the patient-physician relationship, but said it could provide "more information for physicians to enable them to make informed decisions."
