Study shows doubts:

Patients don’t want medical advice from AI

News
01.08.2024 12:25

A pulling sensation in the stomach, a persistent cough or a strange spot on the toenail: consulting Google about one's symptoms is nothing new. AI chatbots now promise even more possibilities for self-diagnosis, or so you might think. But there are reservations.

With the increasing popularity of AI chatbots such as ChatGPT, the possibilities for digital self-diagnosis appear to have grown. In fact, a study shows that the medical expertise of artificial intelligence is still met with considerable reservations.

Scientists in Würzburg investigated how people react to AI-generated medical advice. "We were not interested in the technical competence of AI, but solely in the question of how the AI output is perceived," said Moritz Reis of Julius-Maximilians-Universität about the study, published in the journal "Nature Medicine".

Study with more than 2,000 participants
The research team divided more than 2,000 participants into three groups, all of whom received identical medical advice. The first group was told that the recommendations came from a doctor. In the second group, an AI-based chatbot was named as the source, and the third group assumed that the advice came from a chatbot but had been checked again by a doctor.

The participants rated the recommendations for reliability, comprehensibility and empathy. As soon as they suspected that an AI was involved, they perceived the advice as less empathetic and less reliable. This was also true for the group that believed a doctor had reviewed the AI recommendations. Accordingly, they were less willing to follow these recommendations. "The effect of bias against AI is not huge, but it is statistically significant," commented Reis.

The cognitive psychologist partly attributes the AI scepticism to stereotypes: "Many people believe that a machine cannot be empathetic." In terms of comprehensibility, however, all three groups rated the advice equally.

AI scepticism is highly relevant for doctors
For the research group, the AI scepticism it identified matters because AI is playing an increasingly important role in medicine, and numerous studies on new AI applications are currently being published. This makes public acceptance all the more important, says Reis: "The question of the future use of AI in medicine is not just about what is technically possible, but also about how far patients are willing to go along with it." Education about specific applications and about AI in general is necessary. "In addition, other studies have shown how important it is for patient trust that, in the end, the human doctor always holds the final decision-making power together with the patient," Reis emphasized.

The scientist considers transparency particularly relevant: "This means, for example, that an AI not only makes a diagnosis, but also explains comprehensibly what information led to this result."

Diagnosis hit rate for chatbots varies
The quality of these results has been scientifically investigated for some time, with mixed findings. For example, a study published in the Journal of Medical Internet Research in 2023 attested to ChatGPT's high diagnostic accuracy: tested on 36 case studies, the chatbot made the correct final diagnosis in almost 77 percent of cases. According to a Dutch study, its diagnostic competence in emergency rooms even came close to that of doctors: supplied with anonymized data from 30 patients treated in a Dutch emergency department, ChatGPT made the correct diagnosis in 97 percent of cases (Annals of Emergency Medicine, 2023).

In contrast, a study published in the journal "JAMA" in 2023 found that the chatbot correctly diagnosed only 27 of 70 medical case studies, just 39 percent. A study presented in the journal "JAMA Pediatrics" concluded that the hit rate is even worse for diseases that primarily affect children.

A recent study published in the journal "PLOS ONE" has now investigated whether ChatGPT could be useful in medical training. According to the research team from Canada's London Health Sciences Centre, the chatbot not only has access to a huge knowledge base, but is also able to communicate this knowledge interactively and comprehensibly.

Opportunities in medical education
The group fed ChatGPT 150 so-called case challenges from a database of medical case histories describing symptoms and disease progression. In these challenges, both trainee doctors and practicing physicians are asked to make a diagnosis and develop a treatment plan in a multiple-choice format.

ChatGPT was correct in just under half of the cases (74 out of 150) in this test. The study found that ChatGPT had difficulty interpreting laboratory values and imaging results and overlooked important information. The authors accordingly conclude that ChatGPT in its current form is not accurate as a diagnostic tool and that caution should be exercised when using the chatbot either for diagnosis or as a teaching aid.


The combination of high relevance and relatively low accuracy argues against relying on ChatGPT for medical advice.

Study finding of the London Health Sciences Centre

"The combination of high relevance and relatively low accuracy argues against relying on ChatGPT for medical advice, as it can present important information that may be misleading," the study states, a warning that most likely also applies to medical laypersons who use the chatbot for digital self-diagnosis.

ChatGPT itself emphasizes that it is not suitable for this purpose. When asked about its diagnostic qualifications, the bot replies: "I am not a doctor and have no medical training. I can provide information on medical topics, give general advice and answer questions, but I cannot make medical diagnoses or offer professional medical advice. If you have health problems or questions, you should always consult a doctor or a qualified healthcare provider."

This article has been automatically translated.