Why centralized data access is key to your organization becoming 'GDPR ready'

This content, written by Daniel Mintz, was originally posted on the Looker Blog on May 25, 2018. The content is subject to limited support.

After the better part of two years of preparation, debate, and conjecture across the technology industry, today the General Data Protection Regulation (GDPR) is finally upon us.

In the past, the impact of this type of regulatory change would have been confined to the IT and data teams. Nowadays, however, nearly everyone handles data. From customer communications to employee records and beyond, much of this information will qualify as personal data. Under the GDPR, that means the data must be controlled, used only in line with published commitments, secured, and ‘deletable’.

Yet for many companies, giving people access to data has typically meant copying, exporting, and extracting it, which leaves a trail of personal data across any number of laptops, servers, and systems, both inside the company and with third parties.

Tackling data sprawl

Once data is disconnected from the central source, people begin to rely on the kinds of decentralized storage “systems” mentioned above. “Oh, I have that list of email addresses on my laptop.” They’re then left with disparate data ‘swamps’ that are nearly impossible to search and even harder to manage and protect.

From the perspective of IT, it’s one thing to control one highly guarded fortress. It’s another challenge entirely when you don’t know how many fortresses exist, what data is inside them, or how it’s used, and you have no record of how many keys have been copied or who holds them. This is the challenge Chief Privacy and Data Protection Officers are being presented with. It’s a problem we need to tackle as an industry, or many companies will fall victim to the GDPR’s potentially severe penalties or to a loss of customer trust.

This is an issue that requires a long-term solution; it cannot be solved by a one-time, CIO-led data swamp cleanup. If data analysis tools encourage “data sprawl” by extracting data and moving it into ‘data workbooks’ for analysis, the problem will simply recur. Even after CIOs and IT teams have transformed their data swamps into clean, organized data lakes, those same analysis tools start the problem all over again, creating a never-ending spiral of pain.

Why you need a single access point for your data

That’s why any long-term solution has to address the root of the problem. Businesses need a single access point for their data, and they need to be able to see who has accessed that data and what was done with it, all in one centralized, managed, secure place.

Introducing this kind of system immediately cuts down the number of steps required to start examining data and determining whether it’s actually useful. Analytics can happen faster, without encouraging data sprawl. And because such platforms leverage the world-class security of today’s most advanced databases, administrators gain control over, and insight into, who is accessing data and how long it is cached.
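To make the idea concrete, here is a minimal, hypothetical sketch of what a single, audited access point might look like. It is not Looker’s actual implementation; the DataGateway class, permissions mapping, and audit table are invented for illustration. The point is simply that every query flows through one gateway that checks permissions and records who asked for what.

```python
import sqlite3
from datetime import datetime, timezone

class DataGateway:
    """Hypothetical single access point: every query is authorized and audited."""

    def __init__(self, db_path, permissions):
        self.conn = sqlite3.connect(db_path)
        self.permissions = permissions  # e.g. {"alice": {"customers"}}
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS audit_log (ts TEXT, user TEXT, query TEXT)"
        )

    def query(self, user, table, sql):
        # Check that this user may touch this table before running anything.
        if table not in self.permissions.get(user, set()):
            raise PermissionError(f"{user} may not query {table}")
        # Record who asked for what, and when: the audit trail that GDPR work relies on.
        self.conn.execute(
            "INSERT INTO audit_log VALUES (?, ?, ?)",
            (datetime.now(timezone.utc).isoformat(), user, sql),
        )
        self.conn.commit()
        # The data stays in the database; only the answer to this question leaves it.
        return self.conn.execute(sql).fetchall()
```

Because every request funnels through one place, answering “who has looked at this personal data?” becomes a query against the audit log rather than a hunt across laptops and spreadsheets.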

The role of centralization

Looker is a centralized, flexible data platform that leaves your data in your database. This means that employees no longer need to extract the data to analyze it. They can interpret it and act on it directly, accessing only the data they need to answer their immediate questions, while still retaining the ability to ask more.

This makes it possible to develop a long-term data governance and analysis strategy in which analysts can still provide their organization with game-changing business insights while maintaining regulatory compliance. An easier process. Cleaner data. And GDPR ready. That’s the modern approach to analytics your data-led business should consider.
