The following is a guest post and opinion of Samuel Pearton, CMO at Polyhedra.

Reliability remains a mirage in the ever-expanding realm of AI models, slowing mainstream AI adoption in critical sectors like healthcare and finance. AI model audits are essential to restoring reliability in the AI industry, helping regulators, developers, and users strengthen accountability and compliance.

But AI model audits can themselves be unreliable, because auditors must independently review the pre-processing (training), in-processing (inference), and post-processing (model deployment) stages. A ‘trust, but verify’ approach improves the reliability of audit processes and helps society rebuild trust in AI.

Traditional AI Model Audit Systems Are Unreliable

AI model audits help stakeholders understand how an AI system works, assess its potential impact, and obtain evidence-based reports they can act on.

For instance, companies use audit reports to acquire AI models based on due diligence, assessment, and comparisons between different vendors’ models. These reports also confirm that developers have taken the necessary precautions at every stage and that the model complies with existing regulatory frameworks.

But AI model audits are prone to reliability issues because of inherent procedural shortcomings and human-resource constraints.

According to the European Data Protection Board’s (EDPB) AI auditing checklist, an audit of a “controller’s implementation of the accountability principle” can differ from an “inspection/investigation carried out by a Supervisory Authority,” creating confusion among enforcement agencies.

The EDPB’s checklist covers implementation mechanisms, data verification, and the impact on data subjects through algorithmic audits. But the report also acknowledges that audits assess existing systems and do not question “whether a system should exist in the first place.”

Beyond these structural problems, audit teams need up-to-date domain knowledge of data science and machine learning. They also need complete training, testing, and production sampling data, which is often spread across multiple systems, creating complex workflows and interdependencies.

Any knowledge gap or error
