

Everything everywhere all at once: data compliance
I found Terry Ray's article, The Blind Spots of Data-Regulation Compliance, to be an insightful and relevant exploration of the challenges data privacy professionals face in today's rapidly evolving regulatory landscape. With the proliferation of data-protection laws like the GDPR, CCPA, and PIPL, it's clear that businesses need to be more vigilant than ever in managing and protecting their data.
A key point Ray raises is the issue of prioritization in data protection. While well-intentioned, regulations that mandate specific protections for certain categories of data can inadvertently create security blind spots. Data that should be protected may be overlooked or misclassified, potentially leading to breaches or noncompliance. I believe this is a genuine concern for all data privacy professionals, and one we should be addressing proactively.
Ray also highlights the challenge of dealing with unstructured data, which IDC predicts will make up more than 90% of all data generated globally in 2023. This statistic underscores the need for organizations to invest in robust, scalable solutions that can effectively manage and categorize this vast trove of information. In my experience, finding the right tools to handle unstructured data has been an ongoing challenge.
The article concludes with a call to adopt an "Everything Everywhere All at Once" approach to data protection, which involves abandoning selective protections in favor of a more holistic strategy. By centralizing data management and using solutions that integrate with a variety of technologies and cloud platforms, organizations can improve regulatory compliance while reducing blind spots.
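To make that contrast concrete, here is a minimal, hypothetical sketch in Python of what "scan everything" data discovery might look like, as opposed to checking only the records a prior classification step has already labelled as sensitive. The directory path, regex patterns, and function names are illustrative assumptions for this newsletter, not any vendor's actual tooling.

```python
# Hypothetical illustration: scan every readable file for common personal-data
# patterns, rather than only the files an earlier classification step flagged.
# Paths, patterns, and names are assumptions for the sake of the example.
import re
from pathlib import Path

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "uk_ni_number": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_everything(root: str) -> dict[str, list[str]]:
    """Walk all files under `root` and record which PII patterns appear where."""
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file: skip it rather than silently trust it
        hits = [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings

if __name__ == "__main__":
    for file, kinds in scan_everything("./shared-drive").items():
        print(f"{file}: {', '.join(kinds)}")
```

The point of the sketch is simply that coverage does not depend on how a file was categorised beforehand, which is exactly where the blind spots Ray describes tend to arise.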
The ChatGPT regulatory drama continues
It has been interesting to follow the recent regulatory developments surrounding OpenAI's ChatGPT. Italy's ban on the generative text tool has raised data privacy concerns and could mark the beginning of broader regulatory challenges for the company. The Italian data protection regulator, the Garante per la Protezione dei Dati Personali, has issued a temporary emergency decision, claiming that OpenAI lacks the legal right to use people's personal information in ChatGPT.
This action, the first taken against ChatGPT by a Western regulator, highlights privacy tensions around the creation of large generative AI models, which are often trained on massive amounts of internet data. European countries, such as France, Germany, and Ireland, are already showing interest in Italy's findings. Tobias Judin, the head of international at Norway's data protection authority, suggests that if a model is built on data that may be unlawfully collected, it raises questions about whether anyone can use the tools legally.
The situation also comes amid increased scrutiny of large AI models, with tech leaders calling for a pause on the development of systems like ChatGPT. Europe's GDPR rules further complicate matters, as they protect the personal data of more than 400 million people across the continent. The Italian regulator has identified four issues with ChatGPT under the GDPR, including the lack of age controls, the provision of potentially inaccurate information about people, and the absence of a legal basis for collecting people's personal information in its training data.
The Italian regulator's action could be the first of many cases examining OpenAI's data practices, and similar concerns may affect the development of machine learning and generative AI systems across the industry. It is crucial to address the lawful collection of training data for AI systems, as we approach a tipping point for this technology's widespread adoption.
Ultimately, this situation highlights the importance of incorporating data protection by design and default in AI systems. As privacy rules evolve, the ability to delete or correct information in AI systems will become increasingly important. Until then, it remains to be seen how GDPR and other privacy regulations will adapt to address the challenges posed by large language models like ChatGPT.
Other data privacy news
Children’s data compliance in the UK - not child’s play
On 4 April 2023, the UK data protection regulator (ICO) announced that it had fined TikTok 12.7 million GBP for misusing children’s data. This is the ICO’s third highest fine issued to date; apart from reflecting the additional considerations required for the processing of children’s data, this fine may also be indicative of further upcoming regulatory action regarding children’s data compliance.
Western Digital says hackers stole data in ‘network security’ breach
Data storage giant Western Digital has confirmed that hackers exfiltrated data from its systems during a “network security incident” last week.