Deepfake Scams Target Wealthy Italian Businessmen

In an even more alarming development, deepfake scams have become a serious risk to prominent business leaders, especially in Italy. On February 4, Guido Crosetto, Italy’s Minister of Defence, received a call from a friend, a prominent entrepreneur, alerting him to a sophisticated scam targeting notable businessmen across the country. The scheme leveraged deepfake technology to clone voices, and it convinced victims to wire thousands, in some cases hundreds of thousands, of dollars.

These dangerous deepfake calls traded on Crosetto’s credibility and targeted other powerful figures as well, including fashion tycoon Giorgio Armani and Patrizio Bertelli, the CEO of Prada. On these fake calls, Crosetto’s synthetic voice reportedly told victims to wire roughly one million euros, about $1.17 million, to a foreign bank account. On each call, the scammers supplied the bank account details for the transfer. Complicating matters further, they sent follow-up messages purporting to come from Crosetto’s actual staff.

Massimo Moratti, the former owner of the Inter Milan football club, was one of the victims of this scam. He wired the money the scammers asked for, and he has since taken his fight a step further by filing a legal complaint with the prosecutor’s office. His experience illustrates the deepening worry about these scams in an increasingly digital world.

The Rising Threat of Deepfake Scams

Crosetto’s incident and the reports from other Italian businessmen point to an emerging pattern: deepfake fraud is rising worldwide. In the first half of this year alone, victims lost more than $547.2 million to deepfake scams globally, with losses jumping from just over $200 million in the first quarter to $347 million in the second. This unsettling trend marks an unprecedented uptick in both frequency and monetary damage.

Researchers have taken note of this troubling trend and begun studies to assess how convincing deepfake technology has become. In one such study, a team created 40 samples of AI-generated voices using a platform known as ElevenLabs, alongside 40 recordings of real human voices to serve as a benchmark for comparison.

The results showed that 41% of AI-generated voices and 58% of voice clones were misidentified as genuine human voices. The underlying technology that makes deepfakes so convincing continues to improve, which will likely make it even harder for people to distinguish genuine from synthetic communication.

Public Awareness and Precautions

Guido Crosetto has already made moves to stop these incidents, chiefly by raising public awareness of the risks of deepfake scams. On February 6, he posted about the fraud on the social media platform X, aiming to warn others, especially prominent public figures, about the crimes being committed in his name. Crosetto said he would rather put these facts out there so the next person does not fall into the same trap.

“I tell him it was absurd, as I already had it, and that it was impossible,” Crosetto stated regarding his reaction when informed about the scam targeting him. His proactive approach seeks to protect others from potential financial loss and to encourage vigilance among business leaders who may be targeted.

Moratti’s situation encapsulates the risks involved in such scams, as he recounted:

“I filed the complaint, of course, but I’d prefer not to talk about it and see how the investigation goes. It all seemed real. They were good. It could happen to anyone.”

This sentiment is shared across the country. In an age when deepfake technology can create simulations eerily close to reality, business leaders are more at risk than ever.