ChatGPT found by study to spread inaccuracies when answering medication questions

ARTIFICIAL INTELLIGENCE

Last updated: 2023/12/20 at 6:52 PM
Published December 15, 2023

ChatGPT has shared inaccurate information about drug usage, according to new research.

In a study led by Long Island University (LIU) in Brooklyn, New York, nearly 75% of drug-related, pharmacist-reviewed responses from the generative AI chatbot were found to be incomplete or wrong.

In some cases, ChatGPT, which was developed by OpenAI in San Francisco and released in late 2022, provided “inaccurate responses that could endanger patients,” the American Society of Health-System Pharmacists (ASHP), headquartered in Bethesda, Maryland, stated in a press release.

ChatGPT also generated “fake citations” when asked to cite references to support some responses, the study found.

Along with her team, lead study author Sara Grossman, PharmD, associate professor of pharmacy practice at LIU, asked the AI chatbot real questions that were originally posed to LIU’s College of Pharmacy drug information service between 2022 and 2023.

ChatGPT, the AI chatbot created by OpenAI, generated inaccurate responses about medications, a new study has found. The company itself previously said that “OpenAI’s models are not fine-tuned to provide medical information. You should never use our models to provide diagnostic or treatment services for serious medical conditions.”

Of the 39 questions posed to ChatGPT, only 10 responses were deemed “satisfactory,” according to the research team’s criteria.

The study findings were presented at ASHP’s Midyear Clinical Meeting from Dec. 3 to Dec. 7 in Anaheim, California.

Grossman, the lead author, shared her initial reaction to the study’s findings with Fox News Digital.

Since “we had not used ChatGPT previously, we were surprised by ChatGPT’s ability to provide quite a bit of background information about the medication and/or disease state relevant to the question within a matter of seconds,” she said via email. 

“Despite that, ChatGPT did not generate accurate and/or complete responses that directly addressed most questions.”

Grossman also mentioned her surprise that ChatGPT was able to generate “fabricated references to support the information provided.”

In one example she cited from the study, ChatGPT was asked if “a drug interaction exists between Paxlovid, an antiviral medication used as a treatment for COVID-19, and verapamil, a medication used to lower blood pressure.”

The AI model responded that no interactions had been reported with this combination.

But in reality, Grossman said, the two drugs pose a potential threat of “excessive lowering of blood pressure” when combined.

“Without knowledge of this interaction, a patient may suffer from an unwanted and preventable side effect,” she warned.

ChatGPT should not be considered an “authoritative source of medication-related information,” Grossman emphasized.

“Anyone who uses ChatGPT should make sure to verify information obtained from trusted sources — namely pharmacists, physicians or other health care providers,” Grossman added.

The LIU study did not evaluate the responses of other generative AI platforms, Grossman pointed out, so there is no data on how other AI models would perform under the same conditions.

“Regardless, it is always important to consult with health care professionals before using information that is generated by computers, which are not familiar with a patient’s specific needs,” she said.

ChatGPT’s usage policy

Fox News Digital reached out to OpenAI, the developer of ChatGPT, for comment on the new study.

OpenAI has a usage policy that disallows use for medical instruction, a company spokesperson previously told Fox News Digital in a statement.

Paxlovid, Pfizer’s antiviral medication to treat COVID-19, is displayed in this picture illustration taken on Oct. 7, 2022. When ChatGPT was asked if a drug interaction exists between Paxlovid and verapamil, the chatbot answered incorrectly, a new study reported.

“OpenAI’s models are not fine-tuned to provide medical information. You should never use our models to provide diagnostic or treatment services for serious medical conditions,” the company spokesperson stated earlier this year. 

“OpenAI’s platforms should not be used to triage or manage life-threatening issues that need immediate attention.”

The company also requires that when using ChatGPT to interface with patients, health care providers “must provide a disclaimer to users informing them that AI is being used and of its potential limitations.” 
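The disclaimer requirement described above can be sketched as a thin wrapper around any chatbot backend. This is an illustrative assumption about how an app might comply, not OpenAI’s actual mechanism; the function names and disclaimer wording are hypothetical, and the backend is a stub standing in for a real model call:

```python
# Illustrative sketch: a health-care app relaying AI answers to patients
# prefixes every response with a notice that AI is being used and that
# the output has limitations. All names and wording here are assumptions.

AI_DISCLAIMER = (
    "Notice: this answer was generated by an AI system and may be "
    "incomplete or inaccurate. It is not medical advice; verify it "
    "with a pharmacist, physician or other health care provider."
)

def model_backend(question: str) -> str:
    """Stub standing in for a real chatbot API call."""
    return f"(model output for: {question})"

def answer_with_disclaimer(question: str) -> str:
    """Relay a model answer to a patient, prefixed with the disclaimer."""
    return f"{AI_DISCLAIMER}\n\n{model_backend(question)}"

print(answer_with_disclaimer(
    "Is there an interaction between Paxlovid and verapamil?"
))
```

The point of the pattern is that the disclaimer is attached in one place, so no response can reach a patient without it.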

In addition, as Fox News Digital previously noted, one big caveat is that ChatGPT’s source of data is the internet — and there is plenty of misinformation on the web, as most people are aware. 

That’s why the chatbot’s responses, however convincing they may sound, should always be vetted by a doctor.

Woman sick at pharmacy

The new study’s author suggested consulting with a health care professional before relying on generative AI for medical inquiries.

Additionally, ChatGPT was only “trained” on data up to September 2021, according to multiple sources. While its training data can be updated over time, the model is limited in serving up more recent information.

Last month, CEO Sam Altman reportedly announced that OpenAI’s ChatGPT had gotten an upgrade — and would soon be trained on data up to April 2023.

‘Innovative potential’

Dr. Harvey Castro, a Dallas, Texas-based board-certified emergency medicine physician and national speaker on AI in health care, weighed in on the “innovative potential” that ChatGPT offers in the medical arena.

“For general inquiries, ChatGPT can provide quick, accessible information, potentially reducing the workload on health care professionals,” he told Fox News Digital.

“ChatGPT’s machine learning algorithms allow it to improve over time, especially with proper reinforcement learning mechanisms,” he also said.

ChatGPT’s recently reported response inaccuracies, however, pose a “critical issue” with the program, the AI expert pointed out.

“This is particularly concerning in high-stakes fields like medicine,” Castro said.

A health tech expert noted that medical professionals are responsible for “guiding and critiquing” artificial intelligence models as they evolve.

Another potential risk is that ChatGPT has been shown to “hallucinate” information — meaning it might generate plausible but false or unverified content, Castro warned. 

“This is dangerous in medical settings where accuracy is paramount,” said Castro.

AI “currently lacks the deep, nuanced understanding of medical contexts” possessed by human health care professionals, Castro added.

“While ChatGPT shows promise in health care, its current limitations, particularly in handling drug-related queries, underscore the need for cautious implementation.”


Speaking as an ER physician and AI health care consultant, Castro emphasized the “invaluable” role that medical professionals have in “guiding and critiquing this evolving technology.”

By Osama Abdullah
© 2022 BanerClub. All Rights Reserved.