Question everything: Spotting deepfakes in the finance and accounting sector

In the right context, deepfake photos and videos can be entertaining—even comical—but they have the potential to cause widespread panic and perpetuate misinformation. In 2023, an image created using Generative AI purporting to show an explosion at the Pentagon sent shockwaves worldwide and caused stock markets to dip. A year earlier, a deepfake video of Ukrainian President Volodymyr Zelenskyy went viral—and even appeared on a Ukrainian news channel. In it, Zelenskyy tells his soldiers to put down their weapons and surrender to Russia. 

Could I create a deepfake video? 

These are obviously extreme examples, but advances in AI mean that more modest deepfake technology is available to a much wider user base. So, I ran a regular web search to see what I could create with a basic budget and limited experience. I was presented with a series of websites that let you create a very basic deepfake video for around £5.00. I’m guessing most people use these sites for harmless fun—and with no nefarious intentions. But their existence points to something far darker. It should be a warning to accounting and finance teams who might have thought deepfake development costs would deter cybercriminals from targeting their organisation.

It seems to me that it could be very easy to create a short video or audio message from a supplier, customer or partner requesting a change of bank details or an emergency funds transfer. Send it via WhatsApp, Teams or an email, and it would probably fool a couple of people—and that’s all it takes. That’s precisely what happened to an energy company CEO. He handed over almost £200,000 when an AI application that copied the voice of a senior executive at his parent company requested an urgent funds transfer to a third-party bank account. 

Ability to mimic writing style

As if that wasn’t bad enough, advancements in free-to-use large language models (LLMs) allow them to learn writing patterns and generate text that reads as though it were written by a specific individual. The intended purpose is for LLMs to learn to write like their primary user and assist with email and text creation. But it wouldn’t be difficult to teach an LLM to write like someone else—you’d just need several samples of their writing.  

This means a deepfake threat could just as easily come in the form of a regular email or even a text, requiring even less skill or funding. It would be very easy to cast a wide net with this kind of attack and draw in plenty of SMEs in the process; companies that might have thought themselves too small for cybercriminals to concern themselves with. Fortunately, you can take some simple steps to protect yourself:

Processes 

Revisit your company’s accounting and finance policies, processes, and workflows. If you’re not already doing so, apply Know Your Customer (KYC) and Know Your Customer’s Customer (KYCC) principles to all transactions. And if you have them in place, now’s the time to re-examine them in the light of deepfake scams. A smartphone video from your customer authorising a transaction may no longer offer sufficient protection. 

Verify any payment or invoice changes—not just with the source, but with the originator or a nominated party. Don’t rely on email; if it’s a colleague in the same building, walk over to their desk. If they’re based elsewhere, pick up the phone. Consider agreeing on a password they use to confirm their identity in all communications—and change that password regularly so bad actors can’t pick it up. Talk to your IT team about how your organisation could use digital fingerprints or other metadata to provide secondary validation beyond text, voice, or video content. They may already be planning to introduce deepfake detection tools that flag alterations in digital media.
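To make the “digital fingerprint” idea concrete: one simple form is a cryptographic hash of a file, shared over a separate trusted channel so the recipient can confirm the media wasn’t altered in transit. The sketch below (a minimal illustration, not a substitute for the tools your IT team would deploy; the function name is our own) computes a SHA-256 fingerprint of a media file using Python’s standard library:

```python
import hashlib

def media_fingerprint(path: str, chunk_size: int = 65536) -> str:
    """Compute a SHA-256 fingerprint of a file.

    If the sender shares this value over a separate, trusted channel
    (e.g. a phone call), the recipient can recompute it and confirm
    the file they received is byte-for-byte identical.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Changing even a single byte of the file produces a completely different fingerprint, which is what makes the comparison useful as a secondary check. Note this only proves the file is unmodified since the fingerprint was taken—it cannot tell you whether the original recording was genuine in the first place.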

Culture 

One of the most effective ways of preventing deepfakes from succeeding is to question everything. But that’s hard to do without a supportive culture that gives accounting and finance team members the confidence to challenge and verify payments, cash transfers or supplier account changes—especially if they’re leadership team or board-level instructions. Changing corporate culture isn’t easy, but your HR colleagues should be able to provide the necessary help and support to get the ball rolling.  

Education 

Is your IT department conducting deepfake simulations and informing your team about the latest cyber threats? Do you believe you or other team members could recognise a deepfake-led scam? If the answer is no, it’s worth talking to someone from IT to see if they can assist in bringing everyone up to speed.

Stay alert

Deepfakes can be very convincing, but there are some tell-tale signs. These include facial inconsistencies such as unnatural eye movements, skin texture anomalies, and lip-sync mismatches. Also, watch out for voice tone variations and speech cadence. 

Conclusion

You might think your organisation is too small to matter to cybercriminals. But AI increases the pace and sophistication of attacks. It’s possible to create a deepfake photo in about 30 seconds and a basic video in less than ten minutes. So it’s safer to assume that all accounting and finance teams are potential targets for deepfake attacks. 

Living in a world where we question everything and do not accept the truth as a given—whether on social media, in the news, in print, or on TV—is tough. However, cynicism might be our best defence against deepfakes. We must turn that scepticism to our advantage and remember that, with the rapid growth and advancement of AI, things are not always as they appear—even the most innocuous-looking contract or Teams message. By staying informed, maintaining strong IT security and business processes, and applying common sense, we can reduce the likelihood that cybercriminals can deceive us with deepfakes.
