Unexplained Mysteries

Science & Technology

Could deepfakes lead to an epidemic of digital crime?

June 25, 2022

Even US presidents can be deepfaked. Image Credit: YouTube / Halsey Burgund
What if it became almost impossible to tell whether someone's voice or image was actually the real deal?
Nadia Smaili and Audrey de Rancourt-Raymond of the Université du Québec à Montréal describe the growing problem of deepfake technology and how we might go about preventing its misuse.



Deepfakes are video, audio and image content generated by artificial intelligence. This technology can produce false images, videos or sounds of a person, place or event that appear authentic.

In 2018, there were approximately 14,698 deepfake videos circulating online. Since then, the number has soared with the popularity of deepfake apps such as DeepFaceLab, Zao, FaceApp and Wombo.

Deepfakes are used in several industries, including filmmaking, video games, fashion and e-commerce.

However, the malicious and unethical use of deepfakes can harm people. According to research by cybersecurity firm Trend Micro, the "rise of deepfakes raises concern: It inevitably moves from creating fake celebrity pornographic videos to manipulating company employees and procedures."

Increased vulnerabilities

Our research found that organizations are increasingly vulnerable to this technology and the costs of this type of fraud can be high. We focused on two public case studies using deepfakes that targeted CEOs and, to date, have estimated losses amounting to US$243,000 and US$35 million respectively.

The first case of fraud occurred at a British energy firm in March 2019. The firm's chief executive received an urgent call from his boss, the head of the firm's German parent company, asking him to transfer funds to a Hungarian supplier within the hour. The fraud was presumably carried out using commercial voice-generating software.

The second case was identified in Hong Kong. In January 2020, a branch manager received a call from someone whose voice sounded like that of the director of the company. In addition to the call, the branch manager received several emails that he believed were from the director. The phone call and the emails concerned the acquisition of another company. The fraudster used deep voice technology to simulate the director's voice.

In both cases, the firms were targeted for payment fraud that used deepfake technology to mimic individuals' voices. The earlier scheme was less sophisticated than the second, relying on voice phishing alone.

Opportunities and threats

Forensic accounting involves "the application of specialized knowledge and investigative skills possessed by [certified public accountants] to collect, analyze and evaluate evidential matter and to interpret and communicate findings in the courtroom, boardroom, or other legal or administrative venue."

Forensic accountants and fraud examiners, who investigate allegations of fraud, continue to see a rise in deepfake fraud schemes.

One type of deepfake fraud scheme is known as synthetic identity fraud, in which a fraudster creates a new identity and targets financial institutions. For instance, deepfakes enable fraudsters to open bank accounts under false identities. They use these fabricated identities to build a trust relationship with the financial institution in order to defraud it later. These fraudulent identities can also be used in money laundering.

Websites and applications that provide access to deepfake technologies have made identity fraud easier; This Person Does Not Exist, for example, uses AI to generate random faces. Neil Dubord, chief of the police department in Delta, B.C., wrote that "synthetic identity fraud is reportedly the fastest-growing type of financial crime, costing online lenders more than $6 billion annually."

Large datasets

Deepfakes can enhance traditional fraud schemes, like payment fraud, email hacking or money laundering. Cybercriminals can use deepfakes to access valuable assets and data. More specifically, they can use deepfakes to gain unauthorized access to large databases of personal information.

Combined with social media platforms like Facebook, deepfakes could damage the reputation of an employee, trigger decreases in share values and undermine confidence in a company.

Forensic accountants and fraud investigators need to recognize red flags related to deepfakes and develop anti-fraud mechanisms to prevent these schemes and reduce the associated losses. They should also be able to evaluate and quantify the losses resulting from a deepfake attack.

In our case studies, deepfakes used the voices of senior management to instruct employees to transfer money. The success of these schemes relied on employees being unaware of the associated red flags. These may include secrecy (the employee is asked not to disclose the request to others) or urgency (the employee is pressured to act immediately).
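As a toy illustration (not something the authors propose), the secrecy and urgency cues described above could be screened for automatically in incoming payment requests. The keyword lists below are illustrative assumptions, not a vetted fraud-detection ruleset:

```python
# Toy red-flag screen for payment requests. The cue lists are
# illustrative assumptions, not a vetted fraud-detection ruleset.
RED_FLAGS = {
    "urgency": ["within the hour", "immediately", "right away", "urgent"],
    "secrecy": ["don't tell", "do not disclose", "keep this between us"],
    "payment": ["transfer", "wire", "funds", "payment"],
}

def flag_request(message: str) -> list[str]:
    """Return the red-flag categories whose cues appear in the message."""
    text = message.lower()
    return [category for category, cues in RED_FLAGS.items()
            if any(cue in text for cue in cues)]

request = ("Please wire the funds to our Hungarian supplier within the hour, "
           "and do not disclose this to anyone.")
print(flag_request(request))  # ['urgency', 'secrecy', 'payment']
```

A screen like this would only surface requests for human review; as the case studies show, the decisive control is an employee who pauses and verifies rather than complies.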

Curbing deepfakes

Some simple strategies can be deployed to combat the malicious use of deepfakes:

- Encourage open communication: talking with colleagues and others about anything that appears suspicious is an effective way to prevent fraud schemes.

- Learn how to assess authenticity: for example, ending a suspicious call and calling back the number to assess the person's authenticity.

- Pause before reacting to unusual requests.

- Keep up to date with new technologies that help detect deepfakes.

- Enhance the controls and assessments used to verify client identity at financial institutions, such as Know Your Customer checks.

- Provide employee training and education on deepfake frauds.

Cybercriminals may use deepfakes to make their schemes appear more realistic and trustworthy. These increasingly sophisticated schemes have harmful financial and other consequences for people and organizations.

Fraud examiners, cybersecurity experts, authorities and forensic accountants may need to fight fire with fire, and employ AI-based techniques to counter and detect fictitious media.
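One well-known early detection cue, for example, was that generated faces in video blinked abnormally rarely. A minimal sketch of that idea, assuming blink timestamps have already been extracted by a separate eye-tracking model; the 8-blinks-per-minute threshold is an illustrative assumption:

```python
def blink_rate_suspicious(blink_times_s: list[float], duration_s: float,
                          min_rate_per_min: float = 8.0) -> bool:
    """Flag a clip whose blink rate falls well below the human resting
    baseline (roughly 15-20 blinks per minute). Threshold is illustrative."""
    if duration_s <= 0:
        raise ValueError("duration must be positive")
    rate_per_min = len(blink_times_s) / (duration_s / 60.0)
    return rate_per_min < min_rate_per_min

# A 60-second clip with only two detected blinks looks suspicious:
print(blink_rate_suspicious([12.0, 48.5], 60.0))  # True
```

Single heuristics like this are quickly defeated as generators improve, which is why detection in practice combines many such signals, typically learned by a model rather than hand-coded.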

Nadia Smaili, Professor in Accounting (forensic accounting), Université du Québec à Montréal (UQAM) and Audrey de Rancourt-Raymond, Assistant researcher, Université du Québec à Montréal (UQAM)

This article is republished from The Conversation under a Creative Commons license.

Read the original article at The Conversation.



Source: The Conversation | Comments (6)




Recent comments on this story
#1 Posted by pallidin 2 years ago
Having viewed probably a half-dozen deepfake video clips I am both impressed and deeply disturbed as to how this could be misused to mislead unsuspecting individuals on political issues, pranks, "shaming" and even rapid-fire extortion. The latter meaning I could deepfake an 'affair' and demand money NOW, before the scam victim (wife or husband, etc.) could find out the clip sent is fabricated. The technology itself is very cool though.
#3 Posted by quiXilver 2 years ago
Every tool can be used for its intended use and benefit, or as a weapon, or tool of mischief. Every tool comes with the 'tool-maker/user' mammalian tendencies woven into its very fabric. C'est la vie.
#4 Posted by Nutrition Fact 2 years ago
Cybergamers lost in fantasyland will be the new empty pods, expendable and not missed much. The ones worst affected will be the hardcore RVers, the warcraft types (been playing 30 years man, Im level 20,000 and havent been outside for years)
#5 Posted by Nutrition Fact 2 years ago
People who use the internet in a secondary and noncritical manner should be safest.
#6 Posted by Timothy 2 years ago
Deepfake tech will keep getting better. I think that the distinguishing factor will be that it is extremely hard to deepfake something and then retrospectively claim that it came from a certain device. It’s possible, but difficult.



