Looking across the portfolios of top pharma companies, we’re approaching an interesting time for pharma strategy. Rich pipelines and diverse multi-indication medicines are setting the stage for a period of pharma defined by delivering more for less.
Fundamental to these strategies will be the management of cost bases as the complexity and scope of operations increase – so how do we deliver the same quality and consistency of customer engagement without cost bases growing out of control?
Spending on marketing agencies and routine activities has not only grown but become so complex as to be inflexible to any potential restructuring. Marketing budgets encompass a range of activities, not all of which are fully understood, and the ability to adapt to changing market conditions or to best practices in marketing capabilities is limited.
This might soon change, however. The last few years have brought a wave of developments in automation technology – from back-office operational developments such as process automation, through to the truly customer-adjacent technologies of voice assistants like Alexa and Siri.
Pharma has never been shy to talk about sexy new technologies (remember blockchain?), but so often the real-life applications of such tech fail to materialise, even given time. So why would natural language generators such as ChatGPT be any different?
For one, its accessibility. In comparison to the complex nature of concepts such as blockchain, ChatGPT and tools such as DALL-E have brought generative technology to everyone from creative teenagers to experienced professionals. It’s proven that, despite extensive caveats about the accuracy and reliability of its outputs (which we’ll get to), it can be a powerful tool to begin making the most of our human resources.
Human creativity, robotic productivity
Every strategy from the last 10 years regarding the improvement of customer engagement and generation of superior experiences has spoken to the need to make things more human: understanding human needs, making content more personal, and using empathetic understanding to drive strategy. So why are we now talking about automation?
Ultimately, between the human innovation process and the human packaging of content for customers lies a whole lot of extraneous and difficult work: building materials, reviewing materials, combining and reconstructing content, all of it hampered by the creator-approver back-and-forth most marketers are familiar with.
Furthermore, finding the right information, managing internal knowledge, and sifting through what’s available in large organisations also requires a lot of time and resources.
Step past the buzz around ChatGPT’s ability to provide contextual answers based on internet data only up to 2021.
Think instead about piping in your own information base and using it to handle knowledge management, query triage, onboarding resources, and even to analyse trends and generate insight for data-driven decision-making.
In short, natural language-enabled chatbots can trim the fat from the way we work, especially if plugged into internal libraries. Humans still drive the processes, ask the questions, and filter the outputs – but with a powerful concierge handling the grunt work.
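To make the “plugged into internal libraries” idea concrete, here is a minimal sketch of grounding a chatbot in your own information base: retrieve the most relevant internal document for a query, then assemble a prompt that instructs the model to answer only from that context. The document snippets, the keyword-overlap scoring, and the prompt wording are all illustrative assumptions, not a production retrieval approach.

```python
import re

# Hypothetical internal knowledge base: document id -> content.
INTERNAL_LIBRARY = {
    "onboarding": "New starters should complete compliance training in week one.",
    "review-process": "All customer-facing materials require medical and legal sign-off.",
    "brand-guidelines": "Use the approved colour palette and tone of voice in all assets.",
}

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query: str) -> tuple[str, str]:
    """Pick the document sharing the most words with the query (crude relevance)."""
    q = tokens(query)
    return max(INTERNAL_LIBRARY.items(), key=lambda item: len(q & tokens(item[1])))

def build_prompt(query: str) -> str:
    """Assemble a prompt that grounds the model in the retrieved context."""
    doc_id, context = retrieve(query)
    return (
        f"Answer using only this internal context ({doc_id}):\n"
        f"{context}\n\n"
        f"Question: {query}"
    )

prompt = build_prompt("Who needs to sign off marketing materials?")
```

The point of the sketch is the shape of the workflow, not the retrieval method: a real deployment would swap the keyword overlap for proper search or embeddings, and pass `prompt` to an actual language model.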
The buck still stops with us
Crucial in any discussion about AI, and especially ChatGPT, is a conversation about reliability. What chatbots are not, unless specifically designed to be so, are search engines. Many generative chatbots work by simulating plausible outputs, even inventing studies or statistics when asked to provide an intelligent-sounding answer.
In a quick example by one of our team, when asked to summarise the key points of a clinical study twice (to check consistency), the chatbot provided different drug names and diseases, and even changed patient numbers.
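The ask-twice check our team ran can itself be automated: request the same summary twice and diff the factual tokens. The sketch below uses two hard-coded strings as stand-ins for real chatbot responses (the drug names and patient counts are invented for illustration), and a deliberately crude notion of “fact” as capitalised names and numbers.

```python
import re

def fact_tokens(summary: str) -> set[str]:
    """Extract capitalised names and numbers as crude 'facts'."""
    return set(re.findall(r"\b(?:[A-Z][a-z]+|\d+)\b", summary))

def inconsistencies(run_a: str, run_b: str) -> set[str]:
    """Facts appearing in one run but not the other (symmetric difference)."""
    return fact_tokens(run_a) ^ fact_tokens(run_b)

# Illustrative stand-ins for two model responses to the same request.
run_1 = "The trial enrolled 412 patients on Drugalin for asthma."
run_2 = "The trial enrolled 380 patients on Drugamab for asthma."

flagged = inconsistencies(run_1, run_2)  # differing patient counts and drug names
```

A check like this cannot prove a summary is accurate, but any non-empty `flagged` set is a cheap signal that a human needs to verify the output against the source study.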
This can be managed with specific development, but it is important to watch out for in order to ensure these tools are not treated as silver bullets or omniscient content engines. Chatbots developed for specific purposes can operate with greater trust – for example, Google’s Med-PaLM, designed to handle medical queries, which has outperformed similar models on medical exam questions but still only scored 85% accuracy on the US Medical Licensing Exam.
When all is said and done, these tools, however powerful, do not allow for the abdication of responsibility. HCPs and patients have a right to be treated responsibly and ethically – so we must ensure we have the right safeguards and processes to validate any automated outputs.
Making ChatGPT more than just a trend
Like all innovation, starting off with structured thinking is critical. With these generative tools being so readily accessible, it’s easy to make the mistake of diving straight in without first exploring the opportunities or defining what success would look like. As the saying often attributed to Lewis Carroll goes, “If you don’t know where you’re going, any road will take you there”.
Furthermore, consider the limitations of these tools when working in a vacuum of context. Imagine a 10-year-old with all the knowledge of the internet – how would you work with them to get the answers you require, and what guidelines and context might be needed to secure the right outputs?
The potential for these tools to supplement and grow marketing operations while contributing efficiency to cost bases is high, but they must be set up with rigorous structure, and the usage profile must be based on teams’ real needs.
Are your teams struggling with being overwhelmed by queries and customer requests? Are you trying to solve the challenge of turnover and knowledge continuity? Or, quite simply, do you need to generate and turn around more content than you have the capacity for?
Start with the need, set out tests and success criteria, and work iteratively to phase in this new technology, and you have a strong chance of making this buzzword technology a familiar presence in your organisation’s day-to-day.