New AI Attack Tools Are Emerging… And That Should Concern You

There’s a new AI tool called FraudGPT, discussed in the Netenrich report “FraudGPT: The Villain Avatar of ChatGPT,” and WormGPT, used to launch business email compromise (BEC) attacks, has also recently appeared, as discussed in a SlashNext report. Both reports are well worth reading, as AI is clearly being used for evil.

I did a Q&A on this with David Mitchell, Chief Technical Officer at HYAS, and got the following commentary:

  • What are the differences and similarities between these tools/offerings?

“The only difference will be the goal of the particular groups using these platforms: some will use them for phishing/financial fraud and others will use them to attempt to gain access to networks via other means.”

  • Are these just riding on the ChatGPT brand, or are they new AI iterations?  

“GPT stands for ‘Generative Pre-trained Transformer’, which is a specific model of AI use case, not a brand per se. The dark versions being sold may have different training sets and data sizes, but the overarching point is that they have no guardrails or ethics ingrained.”

  • Why now and will we see more of this attack vector? 

“As with any new technology, soon after it is released, nefarious actors begin adopting it in order to learn its weaknesses and exploit them. In the case of GPT, nefarious actors are adopting the technology and enhancing it for their needs.”

  • Can these AI-assisted attacks be detected by currently installed defenses?

“Historically, these attacks could often be detected via security solutions like anti-phishing and protective DNS platforms. With the evolution happening within these dark GPTs, organizations will need to be extra vigilant with their email and SMS messages, because these tools enable a non-native English speaker to generate well-formed text as a lure.”
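To make the detection point concrete, here is a minimal, hypothetical sketch of the kind of content heuristic an anti-phishing filter might apply to an inbound message. The keyword list, regex, and scoring threshold are my own illustrations, not taken from any real product:

```python
import re

# Hypothetical lure signals; real anti-phishing platforms combine many
# more inputs (sender reputation, DNS lookups, link and attachment analysis).
URGENCY_TERMS = ["urgent", "immediately", "wire transfer", "gift card",
                 "verify your account", "password expires"]
SUSPICIOUS_LINK = re.compile(r"https?://\S*\b(?:login|verify|secure)\b", re.I)

def lure_score(message: str) -> int:
    """Score a message: +1 per urgency phrase, +2 per suspicious-looking link."""
    text = message.lower()
    score = sum(1 for term in URGENCY_TERMS if term in text)
    score += 2 * len(SUSPICIOUS_LINK.findall(message))
    return score

def is_suspicious(message: str, threshold: int = 2) -> bool:
    return lure_score(message) >= threshold

msg = ("URGENT: your password expires today. "
       "Verify your account at http://example.com/secure/login now.")
print(lure_score(msg), is_suspicious(msg))  # → 5 True
```

The catch, as the quote above points out, is exactly that dark GPTs produce fluent, well-formed text, so grammar- and keyword-based heuristics like this one grow weaker over time; they are one layer alongside protective DNS and sender authentication, not a standalone defense.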

These new AI-based attack tools are going to make life miserable for defenders. Hopefully, defenses can evolve to make them less dangerous.
