AMWA Blog

An Ethical Approach to Harnessing the Power of AI for Medical Writing

Written by American Medical Writers Association | September 23, 2024 at 1:00 PM

If artificial intelligence (AI) can read diagnostic tests, identify cancer cells, and construct cohorts for clinical trials, can it write medical journal articles and create regulatory documents? In other words, can AI be a medical writer?

No, it cannot. Most major medical journals reserve the task of authorship for humans, because AI is a tool to augment, not substitute for, human intelligence.

This is not to say that AI can’t be helpful in medical writing. AI is a promising technology transforming entire industries, and medical communication is no exception. Generative AI, such as generative pre-trained transformers (GPTs) and chatbots, can be harnessed to enhance the efficiency of medical writers while connecting us to the vast storehouses of knowledge that exist.

It is possible to use AI’s tools responsibly and ethically in the service of medicine and science.

Defining AI

In a recent AMWA Journal article, “Communicating About and With Artificial Intelligence Applications,” J. Kelly Byram, MS, MBA, ELS, makes a key point about AI: it is important to assess whether an AI application is “reliable and secure.”

Byram also describes the Turing test of machine intelligence, a decades-old test designed to determine whether a computer can “think” like a human. “AI is defined many ways, but 2 particularly salient definitions are (1) a machine performing a task requiring human intelligence and (2) a machine replicating human intelligence,” she writes.

For medical communicators new to the world of AI, a few definitions are in order.

  • Machine learning (ML) provides a foundation for many of the AI applications medical communicators encounter in their work. Machine learning is a model that builds on prior knowledge, learns from statistics, and changes as it processes data.
  • Deep learning (DL) is a kind of machine learning whose layered structure resembles the human brain. “It learns from itself and can create new features on its own,” writes Byram. “It can learn nonlinear, high-dimensional relationships from data that are not just unstructured but multimodal.” The general public is wary of AI in part because of the lack of transparency when DL creates its own algorithms.
  • Weak AI refers to systems, such as generative pre-trained transformers (GPTs), that power search engines, Siri, Alexa, and spam filters.
  • Strong AI is the stuff of science fiction: sentient, flexible intelligence that can reason.
  • Large language models (LLMs) include GPT-based tools that help authors with grammar, language choice, references, summaries, and more.
  • GPTs are applications, such as ChatGPT, DALL-E, and custom applications, that are trained on internet data and can generate human-like language in response to prompts.
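The “changes as it processes data” idea in the machine learning definition above can be made concrete with a toy model. The sketch below is purely illustrative; the class and its name are invented for this example, not drawn from any real ML library:

```python
class RunningMeanModel:
    """Toy 'learner': its prediction shifts as each new data point arrives."""

    def __init__(self) -> None:
        self.n = 0        # observations seen so far
        self.mean = 0.0   # current "learned" estimate

    def update(self, value: float) -> None:
        """Incorporate one observation, updating the running mean incrementally."""
        self.n += 1
        self.mean += (value - self.mean) / self.n

    def predict(self) -> float:
        """Return the current estimate, which changes as data are processed."""
        return self.mean


model = RunningMeanModel()
for observation in [2.0, 4.0, 6.0]:
    model.update(observation)
print(model.predict())  # 4.0 — the estimate after "learning" from three points
```

Real ML systems estimate far richer statistics than a mean, but the core behavior is the same: the model’s output is not fixed in advance; it is shaped by the data it has processed.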

Transparent Use of AI

For medical communicators, one of the main things to keep in mind when using AI tools is that consumer GPTs and custom models are trained on the internet, which is rife with inaccuracies. GPTs cannot distinguish fact from fiction, and they have been known to manufacture statistics and citations. 

AI tools can be enormously helpful in medical writing, yet medical communicators will always have the responsibility of checking the accuracy of information. This is why GPTs cannot be considered authors of medical or scientific articles.

Limitations of AI 

The clearest example of AI’s limitations is that it cannot meet the requirements for authorship spelled out by the most respected publications and organizations, such as the Journal of the American Medical Association (JAMA).

In addition, the International Committee of Medical Journal Editors (ICMJE) publishes a simple set of guidelines that clarify who can qualify for authorship. Its Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals include the following 4 criteria:

  1. Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work; AND
  2. Drafting the work or reviewing it critically for important intellectual content; AND
  3. Final approval of the version to be published; AND
  4. Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

The accountability point is clear. Only a human can take responsibility for a document’s content.

Disclosure and Confidentiality

Some AI tools, such as spelling and grammar checks and reference organizers, are in such common use that they do not require disclosure. However, if you are using AI to generate an outline, write a draft, or prepare a manuscript, disclosure requirements may apply. 

Publications may require authors to disclose the prompts used to generate new text or to convert text or images. Such a disclosure typically includes a description of the content created or edited; the name, version, and extension number of the language model or tool; and its manufacturer.

Confidentiality is an important consideration when working with systems that gobble up information to build knowledge. Individual companies have policies that cover confidentiality and cybersecurity, and it’s important never to enter confidential or copyrighted information into a chat box.
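One practical way to honor the caution above is to scrub obvious identifiers before any text goes near a chat box. The sketch below is a toy illustration only; its patterns and labels are assumptions and fall far short of what real de-identification policies require:

```python
import re

# Illustrative patterns only; genuine de-identification needs far more coverage
# (names, addresses, account numbers, free-text clues, and so on).
PATTERNS = {
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(redact("Patient MRN: 204193 was seen on 03/14/2024."))
# Patient [MRN REDACTED] was seen on [DATE REDACTED].
```

Even with a filter like this in place, the safest rule remains the one stated above: do not enter confidential or copyrighted information into a chat box at all.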

Tips and Tools for Using AI

With the caveats described above, there are myriad ways that AI can enhance, simplify, and advance the work of medical communicators. Good prompts get good responses, and the more specific you are, the better results you will get. Prompts can include word limits, reading levels, style, and tone. Imagine asking a coworker to rewrite something 10 times. A GPT doesn’t mind because it has an infinite capacity to revise, and it can write for specific audiences. 
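The advice above about specifying word limits, reading levels, style, and tone can be baked into a reusable prompt template. The function below is a hypothetical sketch; its name, parameters, and defaults are assumptions for illustration, not part of any vendor’s API:

```python
def build_prompt(task: str,
                 word_limit: int = 150,
                 reading_level: str = "8th grade",
                 tone: str = "neutral, patient-friendly",
                 audience: str = "general public") -> str:
    """Combine a writing task with explicit constraints into one prompt.

    Spelling out the word limit, reading level, tone, and audience tends to
    produce more usable drafts than a bare request.
    """
    constraints = [
        f"Limit the response to {word_limit} words.",
        f"Write at a {reading_level} reading level.",
        f"Use a {tone} tone.",
        f"Write for this audience: {audience}.",
    ]
    return task.strip() + "\n\n" + "\n".join(constraints)


# Example: a patient-education request with every constraint made explicit.
prompt = build_prompt(
    "Summarize the key lifestyle changes recommended after a "
    "type 2 diabetes diagnosis.",
    word_limit=120,
    reading_level="6th grade",
    audience="newly diagnosed patients",
)
print(prompt)
```

Because a GPT has no memory of being asked twice, the same template can be rerun with different audiences or word limits until the draft fits, which is exactly the “rewrite it 10 times” advantage described above.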

Because AI is becoming a part of the medical and scientific landscape, more resources and tools are being created every day to help harness the power of these systems while maintaining the accuracy and integrity of published information. 

AMWA’s AI Tip Sheet for Medical Writers is one example of a simple tool that medical writers can consult to understand and use this rapidly evolving technology.

AMWA acknowledges the contributions of Jill Sellers, BSPharm, PharmD, RPh, for peer review in the development of this AMWA resource.