Many enterprises use chat widgets on marketing home pages. Often these widgets are installed without appropriate security or risk mitigation processes in place. This creates an additional data risk for enterprises.
People overshare with chat widgets. Even simple prompts can elicit replies from users that are loaded with highly sensitive personal data. This data is ingested and stored, often on the servers of third-party data processors.
There is a common belief that as long as this data is not processed further, it's not necessarily a problem: storing conversational data may be a risk, but if nothing ever goes wrong, we are technically compliant.
But this is a misunderstanding. The GDPR is not violated by a data breach occurring; it is violated by inadequate security measures. It's just that the inadequacy usually only becomes apparent through a breach.
Data controllers are required to take a risk-based approach to data protection (see in particular Arts. 24, 25 and 32). It can be acceptable to do nothing to mitigate a particular risk because it is low, or because mitigation would be disproportionately costly, but knowing that requires a proper risk analysis. Where the risk is not negligible, appropriate mitigations must be applied. These might include limiting who has access to chatbot data, controlling through which devices or networks the data can be accessed, logging access, actually having a process to check those logs, having a procedure to revoke access, training persons with access to personal data to understand their responsibilities and obligations, having them sign NDAs, running exercises to test the robustness of these measures, and so on.
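To make a few of these measures concrete, here is a minimal sketch of role-based access to chatbot transcripts combined with audit logging. Everything in it is illustrative: the role names, the in-memory `access_log`, and the `read_transcript` function are hypothetical, and a real deployment would pull roles from an identity provider and write to an append-only audit store.

```python
from datetime import datetime, timezone

# Hypothetical role set; in practice this would come from an identity provider.
AUTHORIZED_ROLES = {"privacy-team", "support-lead"}

# Illustrative in-memory audit trail; production systems would use an
# append-only store that someone actually reviews.
access_log = []

def read_transcript(user, role, transcript_id, store):
    """Return a chat transcript only for authorized roles, logging every attempt."""
    allowed = role in AUTHORIZED_ROLES
    access_log.append({
        "user": user,
        "role": role,
        "transcript": transcript_id,
        "granted": allowed,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read chatbot transcripts")
    return store[transcript_id]
```

Note that denied attempts are logged as well as granted ones; a log that only records successes tells you nothing about probing or misuse, which is exactly what a log-review process is meant to catch.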
Alternatively, speak to us and we can handle this burden for you.