The EU's AI Act, adopted in March 2024, was a pioneering step toward comprehensive AI regulation. The legislation aims to ensure that AI systems are safe, transparent, traceable and non-discriminatory.
While these are important parameters for a world-changing technology, the AI Act also placed substantial new operational burdens on chief privacy officers (CPOs) of companies everywhere.
CPOs have had to weather many changes in their job description in the last decade as data regulation has evolved. The AI Act adds to that, pushing CPOs to the forefront of the emerging field of AI governance. As such, CPOs now require a whole new set of technical skills, resources and authority.
The many hats of the CPO
We are way past the days of mere data protection. Now we must govern enterprise AI systems for transparency, fairness, copyright compliance and data security; educate ourselves on AI algorithms, machine learning models and automated decision-making systems; and consider the risks and ethical challenges that AI introduces. All this responsibility requires a fundamental rethinking of the CPO's role within an organization.
The impact of the AI Act on CPOs is already evident. In financial services, CPOs ensure AI credit-scoring models are unbiased and transparent. In healthcare, we need to monitor AI systems that process sensitive patient data. Even in e-commerce, which the AI Act does not consider “high-risk,” CPOs must assess recommendation algorithms for fairness and transparency.
This is not only a technical challenge, but an ethical and regulatory one. CPOs need a deep understanding of frameworks such as the European Commission’s Ethics Guidelines for Trustworthy AI. We must translate complex AI concepts, like federated learning and differential privacy, into actionable policies for other executives. We need to speak the language of data scientists, software engineers and product teams to ensure they're designing AI systems with privacy and ethics in mind.
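To see why translation matters, consider differential privacy: the core idea — adding calibrated random noise so that no single individual's data meaningfully changes a published statistic — can be stated in a few lines of code. The sketch below (function names are illustrative, not from any specific library) shows the classic Laplace mechanism applied to a counting query, the kind of aggregate a privacy team might approve for release.

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    # Sample from the Laplace distribution via inverse-CDF transform.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float, rng: random.Random) -> float:
    # A counting query has sensitivity 1 (one person changes the count
    # by at most 1), so Laplace noise with scale 1/epsilon yields
    # epsilon-differential privacy for the released value.
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(7)
print(dp_count(1000, epsilon=0.5, rng=rng))  # near 1000, but randomized
```

The policy question a CPO must answer is what epsilon to permit: smaller values mean stronger privacy but noisier, less useful statistics — a trade-off that has to be negotiated between legal, data science and product teams rather than decided by any one of them.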
Suffice it to say, it’s a tall order. Many CPOs lack the resources to meet the moment. A 2023 survey by the International Association of Privacy Professionals (IAPP) found that 56% of privacy professionals believed their organizations didn’t fully understand the benefits and risks of AI deployment.
What CPOs need in an AI world
To close this gap, enterprises must act on several fronts. The first is investing in education that gives CPOs the skills to address AI's technical, ethical and legal implications. Broader investment across industries will also empower universities and professional organizations to build curricula on these topics for future generations.
Enterprises also need to commit more financial and human resources. For many CPOs, existing teams are stretched thin by intersecting data compliance challenges. In 2023, nearly 70% of privacy teams reported insufficient budgets, and over six in 10 privacy professionals agreed that limited resources were impacting their ability to deliver on privacy goals. This year, 17% of organizations reported having just one team member tasked with AI governance duties. Hiring or budgeting for AI ethicists, data scientists with privacy experience, and legal experts specializing in AI regulation will put enterprises at a competitive advantage.
AI governance must sit at the core of enterprise strategy and operations. CPOs need a seat at the executive table to take part in key decisions about AI adoption and deployment from the earliest stages.
Failing to empower CPOs in this new era of AI governance will have severe consequences. Organizations risk hefty fines under the AI Act — up to €35 million, or 7% of worldwide annual turnover, whichever is higher — not to mention reputational damage, loss of consumer trust, and competitive disadvantage.
The AI Act raised the bar for AI governance, setting a legislative landmark towards a more responsible, transparent and trustworthy AI ecosystem. CPOs are at the forefront of this transformation. Only those enterprises that invest in strong AI governance, with their CPOs at the helm, will thrive in the digital future.