AI is transforming the world of work, but what does this mean for the future of the tax function?

Build trust through AI to power tax compliance

  • Insight
  • 5 minute read
  • September 12, 2024

Discover how you can revolutionise the tax function to help build trust with stakeholders through AI.

AI is transforming the world of work. It’s a profound change, one that many think will be as significant as the industrial revolution. And the impact is already being felt. PwC’s 2024 AI Jobs Barometer found that the sectors of the economy with the greatest AI exposure are experiencing nearly five times faster productivity growth. And jobs in tax and compliance are set to see an outsized impact.

Business leaders recognise this. Our latest Global CEO Survey found that a majority of CEOs — over two-thirds — think generative AI will drive positive changes in the near term. Not only in operational areas like productivity and process reinvention, but also in driving broader business value through real-time insights and better decision making.

What does this mean for the future of the tax function? AI is increasingly enabling tax leaders to deliver higher-quality work, reduce risk, streamline communication and gain greater situational awareness through improved horizon scanning. And, of course, it’s also revolutionising data insight. Imagine being able to ask any question of your data, in any format, from any part of the world, and get an instant answer.

That’s a really exciting prospect for any tax function, and it’s fast becoming a reality.

Trust: The critical enabler

To capitalise, however, one critical ingredient is needed: trust. That means enabling trust in the data, trust in AI’s outcomes, and trust in the quality of reporting to senior stakeholders across the business, as well as maintaining the trust of external stakeholders, from investors to tax authorities.

Trust is a combination of competence and a responsible approach to AI adoption. Competence is the ability to deliver on an organisation's promises reliably and effectively. A responsible approach means analysing the risks of infusing AI into the business while keeping stakeholders informed about how your organisation uses AI and data: how it’s developed and deployed, how it’s monitored and governed, and whether it’s providing the value they expect. You not only need to be ready to provide those answers, but you must also demonstrate ongoing legal and regulatory compliance. Together, these elements form the foundation of trust, which is essential for building and maintaining strong relationships with stakeholders. Trust is not just about compliance with laws and regulations; it is also about the broader perception of an organisation's behaviour and values, and sustaining it requires consistent actions that align with stated principles and commitments.

There’s no question tax and compliance leaders are managing high levels of uncertainty today: not only around AI’s evolution, but also in anticipating regulatory changes such as Pillar Two and new sustainability reporting requirements. They’re also dealing with ongoing macroeconomic pressures and managing an increasingly dynamic work environment.

The vision: An AI-enabled tax compliance function

It’s clear AI will be a critical part of the response. Some of the most interesting opportunities in the tax function include:

  • Training large language models (LLMs) on tax-related data and contextual knowledge to quickly surface the underlying issue, gauge its complexity and identify which aspects must be researched further.

  • Using next-level data summarisation and real-time data analysis to improve decision making.

  • Enabling automated data gathering and real-time cleansing to reduce manual effort.

  • Introducing AI-powered alerting, verification and data-matching to improve quality and consistency (illustrated in the sketch after this list).

  • Allowing real-time horizon scanning, risk analysis, visualisation and benchmarking.
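
To make the data-matching and alerting idea above concrete, here is a minimal, purely illustrative sketch in Python. It compares period totals from a hypothetical ERP extract against the amounts in filed VAT returns and flags differences above a tolerance for human review; the field names, figures and tolerance are assumptions for illustration only, not a description of any specific tool.

```python
# Illustrative sketch only: automated data-matching between an ERP extract
# and filed return figures, flagging mismatches for human review.
# All field names, figures and the tolerance are hypothetical assumptions.
from dataclasses import dataclass


@dataclass
class Discrepancy:
    period: str
    erp_amount: float
    filed_amount: float
    difference: float


def match_vat_figures(erp_totals: dict[str, float],
                      filed_totals: dict[str, float],
                      tolerance: float = 1.0) -> list[Discrepancy]:
    """Compare per-period VAT totals and flag differences above the tolerance."""
    discrepancies = []
    for period in sorted(set(erp_totals) | set(filed_totals)):
        erp = erp_totals.get(period, 0.0)
        filed = filed_totals.get(period, 0.0)
        diff = erp - filed
        if abs(diff) > tolerance:
            discrepancies.append(Discrepancy(period, erp, filed, diff))
    return discrepancies


if __name__ == "__main__":
    erp = {"2024-Q1": 125_400.00, "2024-Q2": 98_210.50}
    filed = {"2024-Q1": 125_400.00, "2024-Q2": 97_950.00}
    for d in match_vat_figures(erp, filed):
        print(f"Review {d.period}: ERP {d.erp_amount:.2f} vs filed "
              f"{d.filed_amount:.2f} (difference {d.difference:.2f})")
```

In practice, an AI-enabled compliance workflow would layer prioritisation and routing on top of simple checks like this, but the exception still lands with a human reviewer.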

Of course, using AI safely has become paramount. But innovations like these are increasingly necessary. Our most recent EMEA-led Future of Tax Survey found that more than half of tax leaders (57 per cent) expect to see an increase in their tax reporting and compliance responsibilities in the next two to three years. Over two-thirds (69 per cent) said a greater use of technology would be needed as a result, while almost as many (63 per cent) said the same about staff upskilling.

In the Tech, Media and Telecoms space, we see AI-enabled tax compliance functions providing enhanced data accuracy and predictive insights. These organisations typically generate revenue from diverse sources and encounter intricate compliance challenges relating to content and data usage. AI can play a key role in overcoming these challenges by automatically analysing data patterns and identifying risks, creating greater transparency and more trusted reporting, and enabling these organisations to stay one step ahead of complex compliance regulations.

Mary Shelton Rose, Leader of Industry for Technology, Media and Telecommunications, Partner, PwC UK

The catch? Capturing the full value of AI isn’t easy. Business leaders may start their AI adoption journey with off-the-shelf generic tools, such as Copilot and ChatGPT, to explore the productivity gains on offer, but these models in isolation can only take them so far. Leaders also need to tread with caution, as there are questions about how a company’s data and intellectual property may be used by LLMs: information entered into an LLM is visible to its developers and is often used to train the model. This is particularly significant for tax advisors, who are privy to sensitive data.

At PwC, we recently launched a tax AI assistant tool with our strategic alliance partners, Harvey and OpenAI, trained on case law, legislation and other underlying sources. Combined with our own IP, and with the data regularly refreshed to reflect changes and updates to tax rules, the model delivers significantly higher quality and accuracy in the tax domain than publicly available LLMs, and it provides references to the underlying data, allowing tax professionals to validate its outputs transparently and accurately.
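
The grounding pattern described above can be illustrated generically: retrieve the most relevant passages from a curated body of guidance, ask the model to answer only from those passages, and return the source references so a professional can validate the output. The sketch below is a hedged, minimal illustration of that pattern, not a description of PwC's tool; the corpus, the naive keyword scoring and the `call_llm` stub are all hypothetical placeholders.

```python
# Generic, illustrative sketch of a retrieval-grounded prompt with citations.
# The corpus, the naive keyword scoring and the call_llm stub are hypothetical
# placeholders, not a representation of any specific product.
CORPUS = {
    "SRC-1": "Hypothetical guidance passage on VAT place-of-supply rules ...",
    "SRC-2": "Hypothetical guidance passage on Pillar Two top-up tax ...",
    "SRC-3": "Hypothetical guidance passage on transfer pricing documentation ...",
}


def retrieve(question: str, k: int = 2) -> dict[str, str]:
    """Rank passages by keyword overlap; a real system would use a search index or embeddings."""
    terms = set(question.lower().split())
    ranked = sorted(CORPUS.items(),
                    key=lambda item: len(terms & set(item[1].lower().split())),
                    reverse=True)
    return dict(ranked[:k])


def build_prompt(question: str, passages: dict[str, str]) -> str:
    context = "\n".join(f"[{source_id}] {text}" for source_id, text in passages.items())
    return ("Answer the question using only the sources below and cite the source IDs "
            "you relied on. If the sources are insufficient, say so.\n\n"
            f"Sources:\n{context}\n\nQuestion: {question}")


def call_llm(prompt: str) -> str:
    # Placeholder: substitute your organisation's approved model endpoint here.
    return "[stubbed model response]"


if __name__ == "__main__":
    question = "Which source discusses Pillar Two top-up tax?"
    print(call_llm(build_prompt(question, retrieve(question))))
```

Because the answer is tied back to named sources, a reviewer can check each citation against the underlying material rather than taking the model's word for it.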

Beyond this, it is also important to understand the inherent bias in the technology being used. To overcome these challenges, leaders need to define the guardrails that ensure the responsible use of AI, build an infrastructure for safe and trustworthy data, and settle the question of whether to ‘buy or build’.

As organisations become more acquainted with the nuances of AI tools and how they can work together to create business value, leaders should begin to map out the business case for the bespoke solutions needed to unlock AI’s full potential and take their business to the next level. The ability to choose different models for different use cases will be equally important. But this is hard for individual organisations to do on their own, especially given the many other priorities that tax leaders face today. That’s why having a trusted environment that provides access to proven, tax-related third-party AI tools and models is increasingly essential.

The journey: Five essential considerations

Here are five of the most important considerations for developing trusted AI in your tax function.

Creating a culture of trust. Trust begins with governance and culture. On the one hand, it’s about establishing clear guidelines and responsible AI standards, ensuring everyone understands the risks and knows where the red lines are. On the other, it’s a question of fostering a culture of responsibility and trust around AI — which needs to be driven from the top.

Trusting your data. Poor-quality data leads to poor AI outcomes, and to a potentially catastrophic loss of trust. A robust, secure, organised, comprehensive and trusted data strategy is therefore critical. And it needs to be seen not as a technical, compliance-oriented task, but rather as an opportunity to build a foundation for future value creation. This is a topic we explored in our recent data strategy article.

Trusting AI’s outputs. The risks of inaccurate AI ‘hallucinations’ are well documented. The best way to manage this is through a ‘human-led, tech-powered’ approach. It means building in checks and balances to allow experienced professionals to guide AI inputs, review outputs, make the key decisions and perform the important work. Examples include using ‘prompt engineering’ to improve the information supplied to AI models. 
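
As a simple, hedged illustration of that human-led, tech-powered approach, the sketch below combines a structured prompt template with a review queue: every AI-generated draft is held until a named professional approves it. The template, the stubbed `generate_draft` call and the queue are illustrative assumptions rather than a prescribed design.

```python
# Illustrative sketch of prompt engineering plus a human review gate.
# The prompt template, generate_draft stub and review queue are assumptions.
from dataclasses import dataclass, field

PROMPT_TEMPLATE = (
    "You are assisting a tax professional. Using only the facts provided, "
    "draft a response and list any assumptions separately.\n\n"
    "Facts:\n{facts}\n\nTask: {task}"
)


def generate_draft(facts: str, task: str) -> str:
    prompt = PROMPT_TEMPLATE.format(facts=facts, task=task)
    # Placeholder for a call to an approved model; returns a canned draft here.
    return f"[draft produced from a prompt of {len(prompt)} characters]"


@dataclass
class ReviewQueue:
    pending: list[tuple[str, str]] = field(default_factory=list)  # (draft, reviewer)

    def submit(self, draft: str, reviewer: str) -> None:
        self.pending.append((draft, reviewer))

    def approve(self, index: int) -> str:
        draft, reviewer = self.pending.pop(index)
        return f"Approved by {reviewer}: {draft}"


if __name__ == "__main__":
    queue = ReviewQueue()
    draft = generate_draft("Entity X files quarterly VAT returns in two jurisdictions.",
                           "Summarise the filing obligations.")
    queue.submit(draft, reviewer="Senior Tax Manager")
    # Nothing leaves the queue without an explicit human decision.
    print(queue.approve(0))
```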

Trusting in people. People, ultimately, are at the heart of any AI transformation. But they need the right skills to use AI in a trusted way. Indeed, our own AI-powered transformation at PwC underscores the importance of upskilling and change management. And we’re not alone: our 2024 Global CEO Survey found 69 per cent of business leaders think AI will require most of their workforce to develop new skills. In fact, the evidence for a skills-led transformation is mounting. PwC’s AI Jobs Barometer found that the occupations that are most ‘AI-exposed’ will see as much as a 25 per cent change in the skills required, increasing the pressure on workers to upskill to stay relevant.

Building trust for stakeholders. Although transparency doesn’t in itself replace the need for trust, tax and compliance leaders can go a long way to building trust with internal and external stakeholders by being transparent about how AI systems operate, making their decision-making processes understandable, demonstrating how data is safeguarded, and so on. 

AI you can trust

As tax compliance leaders continue to navigate rapid change, the ability to leverage AI will become increasingly essential. But this is more than a question of technology adoption. It’s about building and maintaining trust with all their stakeholders: employees, investors and regulators. That trust is the key that will unlock AI’s true long-term value. We have started this journey at PwC, and we’re excited about the transformational nature of GenAI and about putting the technology directly in the hands of our people. We can share our learnings and make our AI investments available to you, so you can accelerate your own journey and use trusted AI to transform your compliance function.

Authors

Jonathan Howe, Global Connected Tax Compliance Leader, PwC United Kingdom

Stan Berings, Tax Partner, EMEA Connected Tax Compliance Leader, PwC Netherlands

Mary Shelton Rose, Leader of Industry for Technology, Media and Telecommunications, PwC United Kingdom

Marcel Jakobsen, Chief Technology Officer - Global TLS, PwC Netherlands
