Chat Bots in the Life Sciences – Engagement Opportunity or Compliance Liability?  

Life science companies are leveraging data to better meet the needs of healthcare providers. In doing so, they must balance the use of technology, which offers real-time, scalable responses, with the need to provide a human interface.

Medical affairs representatives within life science companies may provide responses to healthcare providers so long as those responses constitute “scientific discussion.” When conducted appropriately, medical affairs communications with healthcare providers are protected by the First Amendment, and government oversight of such communications is therefore subject to strict scrutiny. Inappropriate medical affairs engagement, however, may render the engagement “promotional behavior,” which is subject to different expectations, and such engagement can result, and has resulted, in multi-billion-dollar fines. As companies begin to leverage healthcare provider engagement technologies, such as chat bots, they must take care that those tools do not become promotional.

Preliminary considerations for the use of chat bots by life science companies include:

  1. Non-Promotional Responses
  2. Dependable Privacy
  3. Developing Consistent Results

1) Non-Promotional Responses

To ensure that chat bot responses are non-promotional in nature, companies must review the proposed responses. These reviews must be validated to demonstrate that, even where “fair balance” requirements do not necessarily apply, the communication is a true “scientific discussion” of the risks and benefits of the product and/or health care condition being discussed. Additionally, companies must ensure that the communications are intended for a scientific audience, such as physicians, nurses, pharmacists and other healthcare providers.

2) Dependable Privacy

As companies begin to collect and analyze data relevant to healthcare providers, it is important that this information be collected in compliance with all applicable laws, not only the Food, Drug, and Cosmetic Act, but also other relevant laws such as HIPAA, the FTC Act, and applicable state laws.

It is also critical to understand that as companies use artificial intelligence to power their chat bots, these chat bots may depend on unstructured data to make decisions about targeting individuals or responding to individual needs. These decisions may result in patient healthcare data being inappropriately included in standard response documentation. Companies must have a process to ensure that such information is appropriately managed.
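By way of illustration only, an early screening step in such a process might resemble the following sketch. All pattern names and expressions here are hypothetical, and a handful of regular expressions is no substitute for validated privacy tooling and human review; the point is simply that responses can be flagged before they reach a healthcare provider.

```python
import re

# Hypothetical, illustrative patterns only. Real patient-data detection
# requires validated tooling and human review, not a few regexes.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date_of_birth": re.compile(r"\bDOB[:\s]+\d{1,2}/\d{1,2}/\d{2,4}\b", re.IGNORECASE),
    "mrn": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
}

def flag_possible_phi(response_text):
    """Return the labels of any patterns that resemble patient data."""
    return [label for label, pattern in PHI_PATTERNS.items()
            if pattern.search(response_text)]
```

A response that trips any flag would be routed to a human reviewer rather than delivered automatically.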

3) Developing Consistent Results

Companies must ensure that chat bot communications are consistent with the company’s own internal medical affairs policies. The chat bots cannot simply depend on machine learning or artificial intelligence; the company must be able to demonstrate that the technology provides dependable, reliable answers that are truthful and non-misleading. Steps to ensure that a chat bot is compliant may include regular internal audits, listening in on calls and questions, and comparing the chat bot’s answers to those that would be provided by live medical affairs personnel.
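As one illustration of how such a comparison might be automated, the sketch below flags chat bot answers that drift from language a medical affairs reviewer approved for the same question. The function name, threshold, and similarity measure are all hypothetical; a real audit program would rest on reviewer judgment, not string similarity alone.

```python
import difflib

def audit_response(bot_answer, approved_answer, threshold=0.8):
    """Flag chat bot answers that drift too far from the approved language.

    Uses a simple text-similarity ratio as a first-pass screen; anything
    below the (hypothetical) threshold is queued for human review.
    """
    similarity = difflib.SequenceMatcher(
        None, bot_answer.lower(), approved_answer.lower()).ratio()
    return {"similarity": round(similarity, 2),
            "needs_review": similarity < threshold}
```

Flagged answers would then be reviewed side by side with the responses live medical affairs personnel would have given.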

Conclusion

Chat bots represent an opportunity for life science companies to truly engage with the needs of their stakeholders. However, there are many risks associated with such engagement. Feel free to reach out to Stevens & Lee for assistance in identifying and addressing such risks.
