Deep Fake Losses to Challenge Cyber Insurers, CyberCube Warns

2021-01-07 15:41:31

What do Ulysses and his Trojan horse, the pickpocket Fagin in Charles Dickens' Oliver Twist, and a video of Facebook's Mark Zuckerberg boasting about how his company 'owns' its users have in common?

They are old and new forms of deception that mislead targets into believing an interaction is something it is not. Today, these interactions increasingly take place in cyberspace, as technology takes social engineering and mimicry to a new level of sophistication.

Cyber analytics specialist CyberCube warns that the use of deepfake video and audio technologies could become a major cyber threat to businesses and cyber insurers within the next two years.

Cyber criminals are well equipped to create fake images and recordings of real business leaders such as Zuckerberg, as well as of politicians and other public figures.

In its new report, Social Engineering: Blurring Reality and Fake, CyberCube says the ability to create realistic audio and video fakes using artificial intelligence and machine learning has been steadily improving. Recent technological advances and companies' increased reliance on video-based communication have accelerated these developments.

With a growing number of video and audio clips of business people now accessible online – partly as a result of the pandemic – cyber criminals have a wealth of data from which to build photo-realistic simulations of individuals, which can then be used to influence and manipulate people.

In addition, a technology known as mouth mapping, developed at the University of Washington, can be used to accurately mimic the movement of the human mouth during speech. This complements existing deepfake video and audio technologies.

The report's author, Darren Thomson, is CyberCube's head of cyber security strategy, based in California. He says that as the availability of personal information online grows, criminals are investing in technology to exploit the trend. "New and emerging social engineering techniques such as deepfake video and audio will fundamentally change the cyber threat landscape and become both technically feasible and economically viable for criminal organizations of all sizes," he warns.

Indeed, in March 2019, cyber criminals used AI-based software to mimic the voice of a chief executive and demand a fraudulent transfer of $243,000, the report said. Reports of such cases have so far been rare, but it is not difficult to imagine the picture changing quickly.

Thomson offered a few hypothetical scenarios. “Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral – only it's not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again it's not real. We've already seen deepfake videos used in political campaigns; it's only a matter of time before criminals apply the same techniques to businesses and wealthy individuals. It could be as simple as a faked voicemail from a senior manager instructing staff to make a fraudulent payment or to move money into an account set up by a hacker.”

The CyberCube report also examines the growing use of traditional social engineering techniques – the exploitation of human vulnerabilities to gain access to personal information and security systems. One facet of this is social profiling, which gathers the information needed to create a false identity, drawing on data available online or from physical sources such as discarded or stolen medical records.

According to the report, the blurring of domestic and business information technology systems created by the pandemic, coupled with the increasing use of online platforms, is making social engineering easier for criminals. Moreover, AI technology makes it possible to create social profiles at scale.

“Historically, a criminal using social engineering techniques would have had to impersonate a close relation or colleague in the physical world. Now, spoofing an email address or creating a fake social media account may be enough,” the report said.
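The report stays at this descriptive level, but as a rough illustration of how such spoofed senders can be flagged, the Python sketch below inspects a message's Authentication-Results header and reports any SPF, DKIM or DMARC check that did not pass. This is not drawn from the CyberCube report; the mail server, domains and message text are invented for the example, and real deployments rely on the receiving mail infrastructure to add this header.

```python
# Minimal sketch (illustrative only): flag messages whose SPF / DKIM / DMARC
# results did not come back as "pass". The sample message, domains and
# addresses below are made up for this example.
from email import message_from_string
from email.message import Message

RAW_MESSAGE = """\
Authentication-Results: mx.example.com;
 spf=fail smtp.mailfrom=ceo@big-corp.example;
 dkim=none;
 dmarc=fail header.from=big-corp.example
From: "The CEO" <ceo@big-corp.example>
Subject: Urgent wire transfer

Please move the funds today and keep this confidential.
"""


def spoofing_indicators(msg: Message) -> list:
    """Return the authentication mechanisms that did not report 'pass'."""
    results = (msg.get("Authentication-Results") or "").lower()
    # A missing or failing result is a warning sign, not proof of fraud.
    return [m for m in ("spf", "dkim", "dmarc") if f"{m}=pass" not in results]


if __name__ == "__main__":
    message = message_from_string(RAW_MESSAGE)
    failed = spoofing_indicators(message)
    if failed:
        print("Treat with caution - checks not passed:", ", ".join(failed))
    else:
        print("Sender authentication headers look clean.")
```

Checks like this catch only crude address spoofing; they do nothing against a convincing deepfake voicemail or video, which is precisely the gap the report highlights.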

CyberCube's Software-as-a-Service platform helps insurance companies underwrite cyber risk and manage cyber risk aggregation. Its data provides insights on millions of companies worldwide and includes modeling of thousands of points of technology failure.

The report warns that there is little insurers can do to prevent the development of deepfake technologies, but it emphasizes that risk selection will become increasingly important for cyber insurers.

“There is no silver bullet that translates to zero losses,” says Thomson. “However, underwriters must still try to understand how a particular risk relates to information security frameworks. It will also be important to train employees to be prepared for deepfake attacks.”

He said insurers should also consider the potential for deepfake technology to cause large losses, as it could be used in attempts to destabilize a political system or a financial market.

Source: CyberCube, Social Engineering: Blurring Reality and Fake
