Overcoming the Black Box Problem In Audit – Using Generative AI Solutions

The emergence of generative AI (Gen AI) and large language models (LLMs) is the catalyst pushing audit firms to adapt traditional audit methodologies and change the way they approach data.

Firms are already integrating these tools into their audits to great effect, recognizing their potential to enhance existing audit processes, improve auditor productivity, and make audits faster and easier to execute.

But as Gen AI solutions become more prevalent, audit firms must also understand their limitations.

Black box Gen AI approaches and systems, particularly those that produce results and answers without any explanation or detailed reference to the underlying data, pose significant challenges for audit firms.

Properly curating and organizing data through an Intelligent Data Management Platform is the necessary first step for firms that want to use Gen AI solutions. The platform overcomes the black box problem by providing transparent, verifiable, and auditable outputs, allowing firms to leverage Gen AI in a fully trustworthy manner.

What the Black Box Problem Means for Audit Firms

AI black boxes can be defined as “AI systems with internal workings that are invisible to the user. You can feed them input and get output, but you cannot examine the system’s code or the logic that produced the output.”

In a black box Gen AI system, the model operates in ways that are not readily transparent or explainable to the user. The auditor can input a prompt and receive a response, but can only guess at how the data was transformed.

This makes it difficult to validate the output of AI tools. For example, the model may generate incorrect, nonsensical, or otherwise false information that has no link to the data from the repository. Similarly, a model may arrive at the correct conclusions but for the wrong reasons. It may identify patterns or parameters that are not actually relevant to the original question.

Some AI tools are intentionally designed to be opaque to obscure how the model functions. Others, such as LLMs like ChatGPT, are so large and complex that it is impossible for humans to fully understand how they work.

Regardless of the reasons, the black box problem is significant to auditors because of the need for a verifiable audit trail that provides traceable records of all actions and decisions.

An auditor-in-the-loop approach ensures automated systems are performing as expected. An Intelligent Data Management Platform makes this possible by providing fully transparent and traceable results.

Trust but Verify With Intelligent Data Management

In a recent survey, 52 percent of accounting and audit professionals recognized the necessity of applying Gen AI in their operations.

For these tools to be effective, they require access to a secure, curated, and organized data repository of structured and unstructured information.

An Intelligent Data Management Platform provides auditors with transparent outputs that link to the original transactions and source documentation. Unlike a black box approach, the platform shows where the data came from and what transformations took place. Every report traces back to a term that has been extracted from a document and tested against another document or transaction.

This traceability is the key. Auditors can easily load or stream data into the data lake, link the documents to the transactions, and perform comparisons. In just a few clicks, the auditor can confirm that a transaction was correct based on documented evidence.
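To make the idea concrete, here is a minimal sketch in Python of how an amount extracted from a source document might be linked to a ledger transaction and tested against it, with the full trail preserved. The class names, fields, and identifiers are illustrative assumptions, not the API of any particular platform.

```python
from dataclasses import dataclass

# Illustrative sketch of document-to-transaction traceability.
# All names and identifiers below are hypothetical, for explanation only.

@dataclass
class ExtractedTerm:
    document_id: str      # source document the value was extracted from
    field: str            # e.g. "invoice_total"
    value: float

@dataclass
class Transaction:
    transaction_id: str
    amount: float

@dataclass
class AuditLink:
    document_id: str
    transaction_id: str
    field: str
    document_value: float
    ledger_value: float
    matched: bool         # True when the documented evidence supports the entry

def test_transaction(term: ExtractedTerm, txn: Transaction, tolerance: float = 0.01) -> AuditLink:
    """Compare a value extracted from a source document against the recorded
    transaction and return a link that preserves the full audit trail."""
    matched = abs(term.value - txn.amount) <= tolerance
    return AuditLink(
        document_id=term.document_id,
        transaction_id=txn.transaction_id,
        field=term.field,
        document_value=term.value,
        ledger_value=txn.amount,
        matched=matched,
    )

# Example: confirm a ledger entry against its supporting invoice.
invoice_total = ExtractedTerm(document_id="INV-1042", field="invoice_total", value=12500.00)
ledger_entry = Transaction(transaction_id="GL-88731", amount=12500.00)
print(test_transaction(invoice_total, ledger_entry))  # matched=True, both source IDs preserved
```

The point of the sketch is that every comparison carries its own evidence: the result is never just a number, it is a record of which document was tested against which transaction.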

With Intelligent Data Management, auditors can understand how the model arrived at its output, reduce the risk of AI hallucinations, and identify missing or inaccurate data for further analysis.
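As a simple illustration of that verification step, the sketch below checks the transactions cited in a model's answer against the linked evidence built up above. The record identifiers and the idea of the model returning cited transaction IDs are assumptions for illustration, not the output format of any real system.

```python
# Hypothetical store of evidence links: transaction_id -> supporting document_id
audit_links = {"GL-88731": "INV-1042"}

def verify_citations(cited_transactions: list[str]) -> dict[str, str]:
    """Return the status of each transaction the model cited: either the
    supporting document it traces to, or a flag for auditor follow-up."""
    status = {}
    for txn_id in cited_transactions:
        if txn_id in audit_links:
            status[txn_id] = f"supported by {audit_links[txn_id]}"
        else:
            status[txn_id] = "no linked evidence - review for hallucination or missing data"
    return status

# Example: one citation traces to documented evidence, one does not.
print(verify_citations(["GL-88731", "GL-99999"]))
```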

Enabling Generative AI With Intelligent Data Management

The audit industry is in the thick of a significant transformation. Gen AI tools make it possible for auditors to collect, analyze, and test more transactions to perform more thorough and complete audits.

However, auditors should not just accept the output of AI tools at face value. While they don’t need to be data scientists, they do need to understand how the models transform data to arrive at a result.

Black box Gen AI approaches and systems are not viable in the audit process. Instead, an Intelligent Data Management Platform ensures that all outputs are transparent and traceable. This helps auditors to both trust and verify the results provided by AI tools so that they can perform more comprehensive and higher-quality audits.


John is the CEO of Vigilant AI, which he co-founded to link business process documentation to accounting entries, automating audit testing and transaction analysis for higher-quality audit results. A graduate of the University of Waterloo and a winner of the 2013 Ottawa Chamber of Commerce "40 Under Forty" Award, John has over 25 years of experience bringing new technologies to market, including his previous role with the market-leading audit analytics firm MindBridge.
