Deepfake videos have already proven to be a problem because of how convincing they look, but deepfake audio may deserve just as much attention going forward. A recent report describes how an employee of a company was tricked into transferring over $200,000 to a scammer after being convinced that the caller was his boss.


According to a report from The Wall Street Journal, the victim believed the caller was the CEO of his company's parent company in Germany. The caller spoke with a German accent and apparently matched the tone and melody of the executive's voice closely enough that the victim did not question the call.

He was instructed to transfer €220,000 (about $243,000) to a Hungarian supplier within the hour, which he did. However, the scammer appears to have pushed their luck too far: when they called back a third time, they used a different phone number, which made the victim skeptical of the caller's authenticity, and he refused to comply with the request.

This is not the first time AI has been used to recreate a person's voice. Recently, a company called Dessa demonstrated that this is possible by creating a simulation of podcaster Joe Rogan's voice that sounds eerily like the real person.

Filed in General. Source: Gizmodo
