ACCA welcomes gov’s proposed AI cybersecurity code


ACCA has welcomed the UK government’s proposed AI cyber code as a useful starting point for a global regulatory approach.

However, the body said the pro-innovation approach of the proposed code – as set out in the government’s white paper – needs safeguards, and its requirements may need to be revisited. The cyber challenge in AI is dynamic, and a ‘point in time’ view can quickly become outdated, the accountancy body added.

ACCA also highlighted the risks and impacts to end users in small and medium enterprises (SMEs), with a significant number of its members operating in this segment. The greater challenges faced by this group of stakeholders on cyber readiness – across both skills and budgets – are well-documented. 

ACCA wants end-user SMEs to be safe and protected from cyber risk, while still being empowered to adopt AI, given its potential to augment business productivity.

Glenn Collins, head of technical and strategic engagement, ACCA UK, said: “ACCA is pleased to see the consultation taking a principle-based approach as our current view of AI offers too many unseen scenarios. ACCA, its members and partners, will be profoundly impacted by its planned use of AI including delivering finance professionals with an optimal experience and skill set for the modern workplace.”

In its response, ACCA also called on the government to tackle the skills gap, which needs to be filled in order to combat cybersecurity risks. The Apprenticeship Levy could be expanded into a more flexible ‘Growth and Skills Levy’ that can be used to fund shorter-term accredited training programmes to upskill and reskill workers on the cybersecurity of AI.

Lastly, ACCA warned that adherence to any code carries a cost, including indirect costs and impacts felt through the supply chain. Effort and expense will also be needed to raise awareness of the code, as well as for monitoring and enforcement.

Narayanan Vaidyanathan, head of policy development, ACCA, added: “We anticipate utility from such a code for those providing assurance or third-party verification of AI systems. This is an important category of stakeholders who will have a key role to play in creating a trusted AI eco-system to supplement the regulatory and legal direction from policy makers. 

“We do not anticipate this group to be subject to the requirements of the code itself, but assurance requires checks against a well-defined, and ideally, publicly available standard – which this code could provide. Cyber risks are a part of what the assurance of an AI system may need to check for. Therefore, those providing assurance would find such a cyber code and associated standards helpful.”
