APRA warns super funds about AI-related risk management and governance

By Jason Pollock on Apr 30, 2026 4:30PM

The Australian Prudential Regulation Authority (APRA) has called for a change in how banks, insurers and superannuation trustees manage AI-related risks.

In a letter to industry published today, APRA warned that governance, risk management, assurance and operational resilience practices are not keeping pace with the scale, speed, and complexity of AI adoption.

The letter outlines the findings of a targeted supervisory review APRA undertook late last year across all its regulated industries examining how AI was being deployed and governed.

The review noted that the expanded use of advanced AI is introducing a range of new financial and operational vulnerabilities for entities, but that information security practices are struggling to keep up with the pace of change.

It also warned that frontier AI models such as Anthropic’s Claude Mythos, which could enhance the discovery of vulnerabilities by bad actors, are expected to further increase the probability, speed and scale of cyber attacks.

Other key observations include boards showing strong interest in AI’s potential benefits, but many lacking the technical literacy required to effectively challenge management on AI-related risks and oversight.

The review also found that AI functionality is often embedded within broader software platforms or developer tooling, reducing transparency over where and how models are trained, updated or constrained, and limiting entities' ability to fully assess and manage the associated risks.

The letter reinforced that APRA expects entities to actively manage information security vulnerabilities and threats.

This includes assessing the implications of AI reliance for operational resilience and business continuity; maintaining strong privileged access management, timely patching, hardened configurations, automated vulnerability discovery, penetration testing and controls over agentic and autonomous workflows; security testing of AI-generated code, software components and libraries; and ongoing consideration of third-party and concentration risks arising from common platforms, services and providers.

APRA also expects entities to establish consistent governance arrangements that include, at a minimum, ownership and accountability across the AI lifecycle; an inventory of AI tooling and AI use cases; human involvement for high-risk decisions and accountability; and training and education of staff on AI use, misuse, limitations and secure practices.

APRA executive board member Therese McCarthy Hockey said regulated entities needed to constantly adjust cyber practices to lift resilience and protect assets in a fast-moving threat environment.

“The AI revolution presents tremendous opportunities for banks, insurers and superannuation trustees to deliver improved efficiency and enhanced customer services. We are already beginning to see these benefits materialise, but we cannot be blind to the risks of such powerful technology – whether in our own hands or the hands of those with malign intent," she said.

“What we’ve observed from our supervisory engagement is that while AI adoption is continuing apace, the systems and processes required to safely govern its use aren’t keeping up. Likewise, the speed at which entities can identify and patch vulnerabilities needs to operate much faster, commensurate with the AI-accelerated threat.

“The findings outlined in today’s letter emphasise our expectations for how entities should be managing these risks in alignment with our prudential standards in areas such as information security, operational risk management, governance and data risk.

“While we are not proposing to introduce additional requirements at this stage, we expect to see a significant improvement in how entities are closing the gaps between the power of the technology they are using and their ability to monitor and control it."

In June of last year, APRA sent out a letter reinforcing its expectations of superannuation funds around information security and the implementation of robust authentication controls.

Copyright © nextmedia Pty Ltd. All rights reserved.
