The use of artificial intelligence has increased significantly in recent years and is expected to grow 47-fold over the next decade.
Perhaps the most concerning element of artificial intelligence for businesses is the rise of deepfake technology and the sophisticated scams that accompany it.
The technology can mimic an individual's appearance, mannerisms and even voice to produce realistic video or audio of them saying or doing things they never said or did. Consider the following example:
In a recent case, a multinational firm was defrauded of $25 million (£19.8 million) after a finance worker took part in a video call with what appeared to be senior members of his team.
During the call he was instructed to make a number of transfers totalling $25 million, which he did. However, the police investigation later found that everyone else on the call was a sophisticated deepfake.
The worker even recognised the voices of those involved, even though it was not actually them. Video conferencing has become increasingly popular, not just in large multinational companies but across all businesses following the rise in home working. This makes all employees more susceptible to this type of scam.
It has been reported that the worker was initially suspicious of the request from the Chief Financial Officer to join the video call, but his concerns were eased when others joined the call, as they looked and sounded like colleagues he recognised.
This is not the first time a deepfake scam has succeeded, but it appears to be the first widely reported multi-person video call scam.
In a recent press conference, police in Hong Kong (where the worker was based) revealed that stolen ID cards had been used to make loan applications and open bank accounts, with deepfakes used to fool facial recognition software.
In fact, deepfakes are all over the news, and not simply in respect of financial scams. Employers should remember that employees may be susceptible to blackmail using deepfake images of them which are embarrassing or even pornographic, for example being forced to reveal confidential information such as trade secrets or passwords to computer systems.
Avoiding deepfake-driven activity
It is important to provide thorough training to all employees to make them aware of these kinds of scams and the steps they can take to avoid falling victim to them.
For example, employers may benefit from requiring a strict process to be followed before transfers can be made, and from ensuring that any unusual request is verified separately through the usual means of communication.
Employees should also ask questions to verify the identity of those on a video or audio call, even if they look and sound like their colleagues.
Any training should reiterate the issues regarding email communication including invites to video conferences.
For example, in the reported case, the employee was suspicious of the initial email. Had he contacted the CFO to confirm that the video conference request came from him, he would not have joined the call.
In a situation like this, an employee is likely to find themselves in difficulty if they have not followed their training in respect of processes to be followed regarding payments and potential IT risks.
The employer may legitimately argue that, by failing to follow procedure, the employee has facilitated a financial crime.
However, if staff have not received sufficient training on the relevant processes, it would be far more difficult for a business to suggest the employee has committed any sort of misconduct.
Training will not be the only relevant factor, though. The quality of the deepfake may also matter: if the quality was poor, the employee could more reasonably be expected to suspect a scam.
A dismissible offence?
Applying this to employers in the UK: if the employee were dismissed for gross misconduct following the transfers in question, he might seek to bring a claim for unfair dismissal (provided he had two years' service).
The employer is likely to argue that, in light of the level of loss and the fact that the employee admitted he was suspicious of the original email (and, given those suspicions, likely failed to follow his training in respect of that email), dismissal was within the range of reasonable responses.
The employee is likely to argue that he was unaware of the use of deepfakes in video conferencing technology (provided he had not received training on this) and therefore he could not have been expected to know that he was a victim of a scam.
Whether the dismissal is ruled unfair is likely to turn on the training provided to the employee in respect of potential scams of this kind and the internal policies in place regarding transfers.
If the employee followed his training and all procedures regarding transfers, any dismissal is likely to be found unfair, despite the significant loss suffered by the employer.
Lessons to be learned
It is therefore important to review your training regularly to ensure employees know to raise any concerns immediately and not to join any call or video call they are suspicious of.
If you are an employer, you should consider putting policies in place to ensure any request for a transfer is verified separately from the initial request.
Not all scams will involve transferring money; some may involve downloading software or clicking a link, for example.
Again, it is important that employees know they should not do this without considering the source of the instruction, no matter how realistic the audio or video telling them to do it is.
The Online Safety Act 2023 makes it an offence to share pornographic deepfake images or videos, but it does not cover other deepfake images, videos or audio. When Sadiq Khan (the Mayor of London) was the victim of deepfake audio, in which he was purportedly recorded making comments about Armistice Day and the pro-Palestinian march due to take place on the same day, he was told by the police that no criminal offence had been committed.
These types of attack, whilst not financial, have the potential to severely damage an individual's or a business's reputation, particularly given the speed at which they can be shared on social media.
It is therefore important to consider all the ways deepfake technology can be used to target your business.
If you require assistance in respect of any HR or employment law issue, please contact either Kristie Willis or Lisa Judd from our Employment department.