Organisations plan to boost internal audit coverage over the potential risks arising from the deployment of AI technology, according to a survey by Gartner.
For example, planned internal audit coverage of AI-enabled cyber threats and AI control failures jumped from 10 per cent each in 2023 to an expected 52 per cent and 51 per cent respectively in 2024 among chief audit executives, the report said. Planned audit coverage of unreliable outputs from AI models rose from 14 per cent in 2023 to 42 per cent in 2024.
Confidence gaps
“Perhaps the most striking finding from this data is the degree to which internal auditors lack confidence in their ability to provide effective oversight on AI risks,” said Thomas Teravainen, research specialist at Gartner’s legal, risk and compliance leaders practice. “No more than 11 per cent of respondents rating one of the aforementioned three top AI-related risks as very important considered themselves very confident in providing assurance over it.”
Publicly available and in-house generative AI applications create a range of new or heightened risks. These include data and information security, privacy, IP protection and copyright infringement. There is also concern over the bias and trustworthiness of AI outputs, with some users complaining that the applications "hallucinate" by producing plausible but incorrect information.
Regulatory focus
Better risk management is crucial for firms operating in Europe following the adoption into law of the EU Artificial Intelligence Act in March 2024. The AI Act takes a risk-based approach, classifying AI systems into four categories: minimal, limited, high and unacceptable risk.
The regulation imposes stiff penalties on those who violate its provisions. Most violations can draw fines of up to €15 million or 3 per cent of annual global turnover, whichever is higher. But for infringements involving the unacceptable risk category – which covers AI-enabled manipulation of people, including certain uses of biometric data – fines can climb to €35 million or 7 per cent of annual global turnover.