BioGPT: a useful tool or cause for concern?


KEY TAKEAWAYS

  • BioGPT, a biomedical-specific generative AI tool, is pre-trained on millions of research articles and shows human parity in its generated answers.
  • The rapid development of AI and its potential applications within medical publishing hold huge promise, but a lack of regulation and guidance is causing some concern across the community.

As the development of generative artificial intelligence (AI) models, such as ChatGPT, continues apace, conversations are ongoing across the medical publishing community regarding the possible benefits and pitfalls of this technology. Microsoft’s biomedical-specific BioGPT, which generates text based on millions of published research articles, has huge potential, but many medical publications professionals remain cautious about its use and call for appropriate guidance to be established. In a recent article for Clinical Trials Arena, William Newton outlines the promise of BioGPT, along with the challenges that must first be overcome.

The promise of BioGPT

As reported by Luo et al in a recent preprint, pre-trained language models such as BioBERT have already displayed powerful abilities in discriminative downstream biomedical tasks, with text mining from the existing literature playing an essential role in areas such as drug discovery and clinical therapy. However, these models are not generative, and pre-trained GPT models have the potential to vastly extend the utility of AI within the biomedical field. BioGPT, when evaluated against 6 biomedical natural language processing tasks, including PubMedQA, outperforms other AI tools and exhibits human parity when answering biomedical questions.
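For readers curious about what using the model actually involves, the brief sketch below shows one way to generate biomedical text with BioGPT, assuming access to the publicly released microsoft/biogpt checkpoint through the Hugging Face transformers library; the prompt and generation settings are purely illustrative and are not those used in the published evaluation.

  # Minimal sketch: text generation with BioGPT via Hugging Face transformers.
  # Assumes the "microsoft/biogpt" checkpoint; prompt and settings are illustrative only.
  from transformers import BioGptForCausalLM, BioGptTokenizer, pipeline, set_seed

  tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
  model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

  generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
  set_seed(42)  # make the sampled output reproducible

  # Ask the model to continue a biomedical prompt.
  outputs = generator(
      "COVID-19 is",
      max_length=40,
      num_return_sequences=3,
      do_sample=True,
  )
  for out in outputs:
      print(out["generated_text"])

As the article goes on to discuss, any text produced this way would still need to be checked against the underlying literature before use in a publication.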

The challenges of using generative AI in medical publishing

The sophistication of BioGPT opens many interesting avenues within biomedical research, for instance in drug development, digital biomarkers, and patient selection for clinical trials. However, like ChatGPT, BioGPT has several limitations, including a tendency to generate inaccurate or misleading text and even to perpetuate existing biases within scientific research.

The uncertain future

Inaccuracies in AI tools such as BioGPT are a growing concern among medical communications professionals, and the issue featured on the agenda of this year’s International Society for Medical Publication Professionals (ISMPP) Meeting. Many at the meeting expressed optimism over AI’s potential within science communication. However, significant concerns were also raised about potential overreliance on AI tools and the disclosure of AI use within manuscripts. To ensure appropriate use of AI within medical publishing, speakers called for guidance, including on the disclosure of AI tools and prompts used. As the take-up of GPT tools among authors, publishers, and even peer reviewers continues to increase, the medical publishing industry must move quickly if it is to provide timely advice and regulation.

—————————————————–

Are you using generative AI in medical publications?
