
Healthcare is turning to AI for ‘scribes’ who take medical notes


Investment in medical note-taking technology has doubled in 2024, as Big Tech groups including Microsoft and Amazon, alongside a wave of start-ups, race to capture a share of the $26bn AI healthcare market.

AI start-ups focused on building digital “scribes” for healthcare professionals raised $800mn in 2024, up from $390mn in 2023, according to data from PitchBook.

Start-ups such as Nabla, Heidi, Corti and Tortus raised funds last year from backers including Khosla Ventures, Entrepreneur First and French tech billionaire Xavier Niel.

The increase in funding comes as companies rush to launch AI-powered products that aim to help doctors capture clinical notes quickly and improve patient interactions, as healthcare becomes a key growth area in the AI boom.

Microsoft, which owns AI speech recognition company Nuance, along with Amazon and Oracle, has announced so-called AI copilots for doctors that use large language models and speech recognition to automate notes from patient visits, highlight medically relevant information and generate clinical summaries.


“I don’t think I’ve seen anything change in 15 years of health care more than this,” said Harpreet Sood, a primary care doctor in south London, who began trialling software from French start-up Nabla 15 months ago.

Sood, a former technology and innovation adviser to the chief executive of NHS England, said that in a full-day clinic of around 40 patients, traditional note-taking would take at least two hours of typing.

“It’s been amazing, easily saving three to four minutes of each 10-minute consultation and really helping to capture the conversation and what it’s about,” he added.

Nabla’s note-taking tool uses Whisper, a speech recognition model from ChatGPT developer OpenAI, and has been used to document nearly 7 million medical visits since October last year.
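For illustration only, the sketch below shows how a recorded consultation could be transcribed with OpenAI’s open-source Whisper model, which Nabla’s tool builds on. The audio file name and model size are hypothetical placeholders, and this is not a description of Nabla’s actual pipeline.

    # Minimal sketch: transcribing a recorded consultation with OpenAI's
    # open-source Whisper model (pip install openai-whisper). The file name
    # and checkpoint size are illustrative, not any vendor's production set-up.
    import whisper

    # Load a small general-purpose checkpoint; larger ones trade speed for accuracy.
    model = whisper.load_model("base")

    # Whisper returns the full transcript plus timestamped segments that a
    # note-taking tool could summarise into a clinical note downstream.
    result = model.transcribe("consultation_recording.mp3")

    print(result["text"])
    for segment in result["segments"]:
        print(f"[{segment['start']:.1f}s-{segment['end']:.1f}s] {segment['text']}")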

Hospitals and general practitioners across the UK’s National Health Service are experimenting with AI note-taking as a way to save time and improve doctor-patient relationships. According to a Mayo Clinic study, doctors spend one-third of their day on administrative work, such as paperwork.

Meanwhile, Microsoft said that Nuance’s DAX Copilot tool, which launched more than a year ago, now records more than 1.3 million doctor-patient encounters each month across 500 US healthcare organisations.

Nuance, which Microsoft acquired for about $20bn in 2022, said the AI tool cuts the time doctors spend on clinical documentation by 50 per cent.


At Stanford Medical School, more than 50 primary care physicians tested Nuance’s AI note-taking tool in 2024, and two-thirds of users said it saved them time.

The notes produced by the AI were carefully checked by doctors for accuracy, and the vast majority, about 90 per cent, had to be edited manually to correct errors, said a person familiar with the trial.

However, the results have encouraged Stanford to plan the rollout of DAX Copilot to all of its providers.

Sood said that although he reviews every note generated by the Nabla app, the mental load of having to write and listen at the same time during a consultation is “reduced, if not completely removed” by the tool.

“You can focus more on the patient, you listen, you are more present, you understand their body language. I enjoy my consultations more now,” he added.

However, the rise of AI medical note-taking has prompted warnings from researchers about the risks of fabricated output, known as “hallucinations”, which can be particularly dangerous in medical settings, as well as questions over patient data privacy.

Researchers at Cornell University and the University of Virginia analysed thousands of transcripts produced by Whisper in 2023 and found that about 1 per cent contained “hallucinated phrases or sentences that did not exist in any form in the underlying audio”.

About 40 per cent of those hallucinations included harmful content, such as promoting violence or fabricating inappropriate relationships, the study said.

“I wouldn’t just rely on the app, I would read every note to check it and go back to the transcript,” Sood said. “There is work to be done there but . . . for me personally, it has been a big change.”


