
Paris-based healthcare technology firm Nabla used a cloud-hosted version of GPT-3 to see whether it could be used to dispense medical advice. The company tested GPT-3's abilities on tasks ranging from low to high medical sensitivity: administrative chat with a patient, medical insurance checks, mental health support, medical documentation and queries, and medical diagnosis. While the bot could accurately provide the price of an X-ray, it was unable to total the cost of several exams. The most startling result came when the test patient admitted to feeling suicidal and the bot replied, "I think you should."

"It lacks the scientific and medical expertise that would make it useful for medical documentation, diagnosis support, treatment recommendation or any medical Q&A," Nabla wrote in a report on its research efforts.

"Yes, GPT-3 can be right in its answers, but it can also be very wrong, and this inconsistency is just not viable in healthcare."

