From stress to strength: four key steps to prepare for the EU AI Act
  • July 15, 2024

The EU AI Act was published in the Official Journal of the European Union on 12 July 2024, with the first requirements starting to apply in early 2025.

Among its various provisions, the EU AI Act sets stringent rules around the development and adoption of Artificial Intelligence (AI). But from a practical perspective, what actions can businesses take now as they prepare for the new AI regulation?

The democratisation of Artificial Intelligence (AI) has made the technology available to an unprecedented number of individuals and businesses, pushing it beyond the exclusive reach of specialised researchers and big tech. In fact, 68% of CEOs in PwC’s 2024 Global CEO Survey agree that generative AI will increase competitive intensity in their respective industry by 2027.

CEO views on generative AI over the next three years:

  • Generative AI will significantly change the way my company creates, delivers and captures value: 70% agree, 18% disagree
  • Generative AI will require most of my workforce to develop new skills: 69% agree, 18% disagree
  • Generative AI will increase competitive intensity in my industry: 68% agree, 19% disagree

Note: Disagree is the sum of ‘slightly disagree,’ ‘moderately disagree’ and ‘strongly disagree’ responses; Agree is the sum of ‘slightly agree,’ ‘moderately agree’ and ‘strongly agree’ responses.

Source: PwC's 27th Annual Global CEO Survey


AI is significantly reinventing the service and product delivery capabilities of organisations, be it for medical diagnosis, financial fraud detection or customised customer service chatbots. As most business leaders aim to scale their AI adoption in the coming months to generate sustainable value, it will be essential for their in-house compliance teams and lawyers to understand the EU AI Act’s impact on their business.

Approaching AI responsibly

Adopting AI governance across all business functions enables adequate oversight of AI projects. A governance framework also enables the C-suite to make informed decisions on investment and scaling based on tangible criteria such as risk tolerance and complexity.

In many cases, a single AI technology can be applied to different use cases, each posing new challenges for governance. With the recent EU AI Act, executives have an opportunity to innovate safely within regulatory guardrails and, at the same time, establish a framework that can be adapted to different risk profiles.

Here are four key steps to get started.


Classifying AI systems

Under the EU AI Act, AI systems are divided into various categories depending on the potential risks they may pose to the health, safety and fundamental rights of individuals. Developing an AI inventory and classifying each system's risks against the Act's risk taxonomy can help in-scope organisations understand the extent of their obligations under the law.

In any case, a risk classification of AI systems can fast-track AI adoption by bringing into focus the priority areas where the business needs to take immediate action. Without a solid governance foundation, compliance teams may be unable to identify and mitigate the risks adequately.
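For teams starting this exercise, the sketch below (written in Python purely as an illustration) shows what a minimal AI inventory with EU AI Act-style risk tiers could look like. The tier names reflect the Act's broad categories (prohibited, high-risk, limited/transparency risk, minimal risk), but the record fields, example use cases and baseline obligations are simplified assumptions for demonstration, not legal guidance.

# Illustrative sketch only: a minimal AI inventory with EU AI Act-style risk tiers.
# The tier names follow the Act's broad categories; the fields, examples and
# obligation mappings below are assumptions for demonstration, not legal advice.
from dataclasses import dataclass, field
from enum import Enum


class RiskTier(Enum):
    PROHIBITED = "prohibited"   # e.g. social scoring of individuals
    HIGH = "high"               # e.g. AI used in recruitment or credit scoring
    LIMITED = "limited"         # e.g. chatbots subject to transparency duties
    MINIMAL = "minimal"         # e.g. spam filters


@dataclass
class AISystemRecord:
    name: str
    business_owner: str
    use_case: str
    risk_tier: RiskTier
    obligations: list[str] = field(default_factory=list)


def register_system(inventory: list[AISystemRecord], record: AISystemRecord) -> None:
    """Add a system to the inventory and attach illustrative baseline obligations per tier."""
    baseline = {
        RiskTier.PROHIBITED: ["Discontinue or redesign the use case"],
        RiskTier.HIGH: ["Risk management system", "Human oversight",
                        "Technical documentation", "Conformity assessment"],
        RiskTier.LIMITED: ["Transparency notices to users"],
        RiskTier.MINIMAL: ["Voluntary codes of conduct"],
    }
    record.obligations = baseline[record.risk_tier]
    inventory.append(record)


if __name__ == "__main__":
    inventory: list[AISystemRecord] = []
    register_system(inventory, AISystemRecord(
        name="CV screening assistant",      # hypothetical example system
        business_owner="HR",
        use_case="Shortlisting job applicants",
        risk_tier=RiskTier.HIGH,
    ))
    for item in inventory:
        print(item.name, item.risk_tier.value, item.obligations)

Even a lightweight register like this makes it easier to see, at a glance, which systems carry the heaviest obligations and where compliance effort should be focused first.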

How can we help?

At PwC Malta, our Privacy & Data team has the resources to help your organisation address compliance challenges in its AI adoption journey. For more information on how we can help, please reach out to our sector leaders below.