Smart AI, smarter writers: Why MedComms needs both 

5-minute read

Published: July 10th, 2025

Anthea, Word Monster Team

In the changing world of MedComms, AI isn’t a future disruptor, it’s a present reality. From conference stages to client briefs, the conversation has shifted from if to how. But as agencies race to adopt new tools and streamline workflows, one truth remains: smart AI still needs smarter writers.

If anything is certain right now, it’s that the impacts of AI aren’t coming; they’re already here. From LinkedIn threads to conference panels, the MedComms world is full of conversations about AI, efficiencies and disruption.

We’ve heard the same message throughout the MedComms industry: clients are looking for more value from fewer resources. Pharma companies, and by extension the agencies they work with, are facing budget constraints, and there is an ever-increasing focus on proving real-world value to clients, not just activity.

Agencies are expected to be AI-literate and agile, staying on top of new technological developments while becoming ever more efficient and profitable.

But in MedComms, speed isn’t our only goal. 

Accuracy and trust are the highest priorities in healthcare communications, and this becomes ever more important as patients gain greater access to health data and diagnostic tests. Patients are becoming health consumers, ordering and interpreting their own health tests. But if they can’t understand the content, what’s the point?

Poor communication can mislead people at the best of times, but in healthcare it can affect real-world health outcomes: patients can become alienated from their healthcare providers or even harmed by inaccurate information.

Now, I don’t want to get down on AI; it has a huge range of uses in MedComms. People across the industry are using it to help with research, editing, summarising and much more. We have tools to power through repetitive or manual tasks (looking at you, slide deck abbreviations) and save our brainpower for researching new products.

But AI lacks context, empathy and accountability. Unlike a human medical writer, ChatGPT has no skin in the game. Corrections are met with ‘Yes, you’re absolutely right!’ and a swift change of position to match whatever you have just shared. Good medical writers have deep therapy area knowledge and are willing to stand behind their work and challenge thinking.

AI has been described as creating ‘information-shaped sentences’, not truth. Hallucinations and inaccuracies are real risks, especially in regulated medical content, and AI outputs can be deceptive precisely because they sound so convincing.

If you’re not a subject matter expert, how would you know that the information provided by AI is wrong? 

To write accurate and persuasive materials, medical writers need to know: 

  • What’s true 
  • What’s critical 
  • What could go wrong if misunderstood 

Human expertise is critical for ensuring scientific accuracy, regulatory compliance, patient-centric storytelling and strategic relevance. 

Great medical writers don’t just write; they interpret, translate and guide patients through their journey. The choices made along that journey can have huge health consequences, so turning science into something patients can genuinely understand matters more than ever.

Rebecca Pope shared a powerful example of an NHS Trust whose patient population had an average reading age of 5–9, and where many patients did not speak English as a first language. She asked, ‘How can a patient choose a treatment if they don’t understand?’

As scientific communicators, we need to take responsibility for content and make judgement calls on what to include and how to proceed. We are responsible for ensuring content is compliant, for example with the ABPI Code of Practice, and for deciding what is ethical versus what is efficient. Do you want to leave these choices in the hands of an AI agent that ceases to exist when you clear its memory?

At A Festival of #Medcomms, we saw how agencies across the industry are experimenting with AI. We heard about how pharma clients are sharing access to their internal AI environments with agency partners, putting agencies in the role of data translator. Writers will increasingly need to manage machines, not just humans. 

The best results come from those who play, test and question. Medical writers who build confidence through experimentation are best placed to understand the risks and rewards of AI, and it can be fun to play around with these tools. 

One thing holds true, though: you should know your content before you present it, whether the material has come from your AI agent or a colleague. NN Group recommend working with AI as you would with an intern, providing instruction, guidance, oversight and correction. Ultimately, AI tools can’t vet themselves; trusted medical writers are needed to take responsibility for content.

Success for medical writers comes down to storytelling and clarity. For patients and policymakers alike, stories drive changes in behaviour.

Ideas don’t need to be complicated. Innovation is not rocket science! But ideas are only powerful when people understand them. 

So, what’s the takeaway? 

  • Use AI well
  • Use technology to help you communicate even better than before 
  • Focus on the people, not the tools 

We can all agree that AI is here to stay, but in MedComms it’s up to us to guide the message. 

At Word Monster, we sit at the sweet spot: leveraging AI tools to create efficiencies, automate repetitive tasks and innovate with structured data. This frees up precious time for our scientific experts to focus on creating intuitive messaging in innovative formats, so the science can be understood by everyone.

Let’s talk about how we’re using AI to enhance our work – drop us a message or visit our virtual office space to say hi. 

Contact us