Fake AI platforms deliver malware disguised as video content

A clever campaign delivering the novel Noodlophile malware is targeting creators and small businesses looking to enhance their productivity with AI tools.

But, in an unusual twist, the threat actors are not disguising the malware as legitimate software, but as the output of a legitimate-looking AI tool.

AI as a social engineering lure

“As AI surges into mainstream adoption, millions of users turn daily to AI-powered tools for content creation,” Morphisec security researcher Shmuel Uzan noted.

When searching for such tools online, some users consult popular Facebook groups and are lured by viral posts on social media into trying out this software.

The ad for the fake AI tool (Source: Morphisec)

Some users may not want to download new software, but they have nothing against uploading a file to a web-based service and receiving AI-generated content in return.

“Once on the fake site, users are prompted to upload their images or videos, under the impression that they are using real AI to generate or edit content. At the final stage, users are instructed to download their ‘processed’ content. In reality, they unknowingly download a malicious file,” Uzan explained.

The final stage (Source: Morphisec)

The Noodlophile malware

The victims download what looks like a media file, but it's actually a ZIP archive.

Inside the archive is a file with an expected-looking filename (e.g., "Video Dream MachineAI.mp4") followed by an alarm-raising .exe extension that's hard to spot, because the attackers pad the name with a long run of spaces before it.
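The following is a minimal, illustrative Python sketch of how such padded double extensions could be flagged (the helper name, regular expression and extension list are assumptions for illustration, not taken from Morphisec's report or the campaign's tooling):

import re

# Filenames in this campaign pad the real extension with a long run of spaces,
# e.g. "Video Dream MachineAI.mp4          .exe", so only the fake ".mp4"
# part is visible in most file managers and archive viewers.
SUSPICIOUS_EXTS = {".exe", ".scr", ".com", ".bat", ".cmd", ".js", ".lnk"}

def looks_like_padded_double_extension(name: str) -> bool:
    """Flag names where a benign-looking extension is followed by
    whitespace padding and a hidden executable extension."""
    match = re.search(r"\.\w{2,4}(\s{2,})(\.\w{2,4})$", name)
    if not match:
        return False
    return match.group(2).lower() in SUSPICIOUS_EXTS

# Example (spaces shortened for readability):
print(looks_like_padded_double_extension("Video Dream MachineAI.mp4          .exe"))  # True
print(looks_like_padded_double_extension("holiday_video.mp4"))  # False

Because many file managers truncate long names and hide known extensions, the padded ".exe" ending is easy to miss without a check like this.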

Running the executable kicks off a multi-stage malware installation chain that ends with the Noodlophile infostealer and the XWorm remote access trojan being loaded entirely in memory.

XWorm is a known threat, but Noodlophile is a new addition to the malware ecosystem.

“Previously undocumented in public malware trackers or reports, this stealer combines browser credential theft, wallet exfiltration, and optional remote access deployment,” Uzan noted.

The malware communicates with the attackers and exfiltrates information through a Telegram bot. It is sold online as part of a malware-as-a-service (MaaS) model and is likely being distributed by different threat actors.

In this specific campaign, the malicious AI tool/service is named Luma DreamMachine, but new fake tools could pop up at any time, and other similar campaigns may already be underway.
