AI repository Hugging Face loaded with malicious files to steal info
OODA Loop reports today that “Hackers Have Uploaded Thousands Of Malicious Files To Hugging Face Repository” based on input from Protect AI.
The OODA Loop story reads in part: “The old Trojan horse computer viruses that tried to sneak malicious code onto your system have evolved for the AI era,” said Ian Swanson, Protect AI’s CEO and founder.
“The Seattle, Washington-based startup found over 3,000 malicious files when it began scanning Hugging Face earlier this year. Some of these bad actors are even setting up fake Hugging Face profiles to pose as Meta or other technology companies to lure downloads from the unwary, according to Swanson. A scan of Hugging Face uncovered a number of fake accounts posing as companies like Facebook, Visa, SpaceX and Swedish telecoms giant Ericsson. One model, which falsely claimed to be from the genomics testing startup 23AndMe, had been downloaded thousands of times before it was spotted…”
Mali Gorantla, Chief Scientist at AppSOC, had this to say:
“It should surprise no one that Hugging Face has become a magnet for malware and bad actors. In the last year, the number of AI models available on Hugging Face has tripled, now topping 1 million. Data scientists and AI developers love experimenting with this vast amount of open-source data to build and train new AI applications. The problem is that most security teams have little visibility into what models or datasets have been downloaded or where they exist. I can’t think of a more obvious place to embed malware, infiltrate corporate defenses, and hide your tracks.”
Security teams need to change their tactics to gain visibility into these repositories and uncover this sort of activity, because this is clearly the next “big thing” for threat actors.
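Gorantla’s warning is not abstract: many model files on Hugging Face are, or contain, Python pickles, and unpickling one executes whatever code its `__reduce__` hook returns. A common defensive idea is to scan the pickle opcode stream statically and flag dangerous imports without ever deserializing the file. Here is a minimal sketch of that approach (the `SUSPICIOUS_GLOBALS` denylist and the `Evil` demo class are illustrative assumptions, not taken from the article; real scanners maintain far longer denylists):

```python
import pickle
import pickletools

# Module/name pairs commonly abused in malicious pickle payloads.
# Illustrative only -- real scanners maintain much longer denylists.
SUSPICIOUS_GLOBALS = {
    ("os", "system"),
    ("posix", "system"),
    ("nt", "system"),
    ("subprocess", "Popen"),
    ("builtins", "eval"),
    ("builtins", "exec"),
}

def scan_pickle(data: bytes):
    """Flag suspicious imports in a pickle stream WITHOUT unpickling it.

    pickletools.genops parses opcodes statically, so the payload is
    never executed.
    """
    findings = []
    strings = []  # recent string pushes; STACK_GLOBAL consumes the last two
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
        elif opcode.name == "GLOBAL":
            # Protocols 0-1: arg is "module name" as one space-joined string
            module, name = arg.split(" ", 1)
            if (module, name) in SUSPICIOUS_GLOBALS:
                findings.append((module, name))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # Protocol 2+: module and name were pushed as separate strings
            if (strings[-2], strings[-1]) in SUSPICIOUS_GLOBALS:
                findings.append((strings[-2], strings[-1]))
    return findings

# A toy malicious object: unpickling it would run a shell command.
class Evil:
    def __reduce__(self):
        import os
        return (os.system, ("echo pwned",))

print(scan_pickle(pickle.dumps(Evil())))    # e.g. [('posix', 'system')] on Linux
print(scan_pickle(pickle.dumps({"a": 1})))  # [] -- benign data is clean
```

The key point is that the scanner only parses opcodes and never calls `pickle.loads`, so the payload cannot fire during inspection. This is the same general technique behind open-source pickle scanners, though production tools also handle archives, nested streams, and obfuscated imports.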
This entry was posted on October 23, 2024 at 4:24 pm and is filed under Commentary with tags Hacked. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site.