Trend Micro Details How Cybercriminals Use AI and ChatGPT For Extortion Scams
Although emerging technologies such as AI are being developed to increase efficiency and make our lives easier, cases in which these technologies are exploited are becoming increasingly frequent. Cybercriminals have been extorting innocent people with deepfake technology, using manipulated photos, audio, and videos to carry out scams that resulted in losses of $2.6 billion last year alone.
Young people and public figures are most at risk of falling victim to these attacks, because their large social media presence makes their voices easy to clone. While AI voice cloning has provided comic relief through voice filters and let us hear classic songs performed in the voices of other artists, it has also given cybercriminals another avenue of crime. AI tools such as VoiceLab can harvest a person's voice biometrics and produce a deepfake voice that sounds exactly like them. Paired with a script lifted from a movie, such a clone can convince close family and friends that their loved one has been abducted.
Attackers can also use ChatGPT to fuse large datasets of potential victims with voice, video, and signal data, while SIM jacking lets threat actors take control of the victim's phone, leaving it unreachable and difficult to track.
You can read the full report by Trend Micro here: https://www.trendmicro.com/vinfo/us/security/news/cybercrime-and-digital-threats/how-cybercriminals-can-perform-virtual-kidnapping-scams-using-ai-voice-cloning-tools-and-chatgpt
This entry was posted on July 7, 2023 at 8:36 am and is filed under Commentary with tags Trend Micro. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response or trackback from your own site.