From Black Box to Glass Box: Sage and PwC commit to tackling AI trust gap in finance

  • New research by Sage and IDC shows seven in ten finance leaders will reject AI outputs they cannot explain, highlighting the need for more transparent, accountable AI.
  • Finance professionals are spending an average of 12.9 hours every week reconstructing, validating, and defending AI outputs, meaning adoption remains slow and cautious.
  • 26% of AI time savings are lost to verification, explanation, and reconstruction work, showing that a lack of transparency doesn’t remove labour – it shifts it into explanation work that slows scale.

Sage (FTSE: SGE), the leader in accounting, financial, HR and payroll technology for small and mid-sized businesses, today announced a new initiative in partnership with PwC, which will redefine how AI is built and adopted in finance, combining transparent, explainable AI with the governance and real-world expertise required to use it with confidence.

The initiative, “Beyond the Black Box”, was announced at Sage Future and is backed by new research from Sage, conducted by IDC, showing that 71% of finance leaders would reject an AI system that cannot explain its outputs, even if those outputs are highly accurate. This suggests that trust, not technology, is holding back AI adoption.

“Finance does not run on answers alone – it runs on answers you can explain,” said Steve Hare, CEO, Sage. “If you cannot show how a number was produced, you cannot use it. That is why we are building AI differently. AI you can trust cannot be a black box; we see it as a glass box that gives finance teams full visibility into how it works, so they can stand behind it with confidence.”

Unlike previous AI initiatives that have focused on large enterprises or purely technical audiences, “Beyond the Black Box” was created with SMB realities at its core. It forms part of Sage’s commitment to helping more SMBs benefit from the transformative impact of AI, building upon the company’s Responsible AI framework and AI Trust Label, reinforcing the belief that trust must be built into AI from the outset.

Trust, not technology capability, is the biggest barrier to AI adoption in finance

As AI becomes more capable, the ability to explain and stand behind its outputs is emerging as the defining factor in whether it is trusted and adopted in finance.

The consequences are already measurable. Finance professionals are spending an average of 12.9 hours every week reconstructing, validating and defending AI outputs. Much of this work stems from the need to validate and explain outputs that do not clearly show how they were produced. Rather than removing overhead, opaque AI is creating a new category of it.

Sage describes this as the trust cost of AI – the gap between what AI systems promise in theory and what finance teams can actually rely on in practice. At its core, this is a transparency challenge. Every number, recommendation and AI-supported decision must be explainable to auditors, to boards, and to regulators. When it cannot be, adoption stalls.

From black box AI to glass box

Sage has designed its AI from the ground up for the realities of finance, where every output is transparent, explainable and accountable, so organisations can trust and act on it with confidence.

This represents a deliberate shift away from black box AI, where outputs are generated without visibility into how decisions are made, towards what Sage describes as glass box AI: customers can meaningfully interact with AI results rather than accepting them on blind faith. Every answer is explainable, every recommendation is verifiable, and every output can be interrogated.

Through the initiative, Sage and PwC will combine their expertise into practical tools and frameworks to help finance teams understand, assess, and adopt AI responsibly. This includes embedding trust into how AI is implemented in finance environments whilst building on Sage’s existing commitment to SMBs, including the Sage AI Academy, which supports organisations with the knowledge and guidance needed to adopt AI with confidence.

From pilot to practice

To help move organisations from AI experimentation to trusted, scalable adoption, Sage selected PwC as its lead partner, drawn by PwC’s proven expertise in deploying AI across its own business. PwC has embedded AI into day-to-day workflows at scale, with 86% of its employees actively using AI tools, more than 240,000 Microsoft Copilot licences deployed, and over 4,000 custom GPTs developed and reused across the firm.

“PwC’s role is to build trust as technology reshapes how business decisions are made,” said Marco Amitrano, PwC UK Senior Partner. “This initiative with Sage reflects a shared ambition: to ensure AI innovation is grounded in the quality and transparency expected of market-leading finance systems.”

Businesses are increasingly concerned about the probabilistic nature of AI systems, particularly the lack of transparency, explainability, and clear accountability behind AI-generated outputs. Together, Sage and PwC will build transparent AI that gives finance teams control and full visibility into its outputs, backed by the implementation expertise, governance frameworks, and risk management capabilities required to put that AI to work safely, effectively, and at scale.
