It looks like the perfect uniform, paired with the ideal patriotic smile, straight out of a Hollywood film like Starship Troopers, but without the ironic tone. For months, Jessica Foster was the digital “girlfriend” of the MAGA (Make America Great Again) movement. With more than a million followers on Instagram, the supposed American soldier shared photos at air bases, posed next to fighter jets and posted messages of fervent support for Donald Trump.
The problem? Jessica Foster doesn’t exist. She is a product of Artificial Intelligence algorithms designed to exploit political niches and convert patriotism into digital subscriptions.
Jessica Foster’s account exploded in popularity in late 2025 and early 2026. In just 120 days, the “influencer” gathered a legion of fans who flooded her comments with “Thank you for your service” and United States flag emojis. The images showed her in impeccable uniforms, sometimes as a sergeant, sometimes wearing elite insignia that would take decades to earn. But none of this made fans suspect that anything was wrong…
It was war veterans and technology experts who first raised the alarm. “In one photograph, she was sporting a unit insignia, but the buttons on her uniform were on the wrong side or blended into the fabric,” noted a digital analyst cited by The Washington Post. Other images revealed classic AI errors: hands with six fingers, medals that don’t exist in the real Army and President Trump with a slightly distorted physiognomy in photos of casual “encounters”.
From Patriotism to ‘AI OnlyFans’
The investigation into Foster’s digital trail revealed a cynical business model: the account’s objective was not just political propaganda, but direct profit. The “soldier’s” biography redirected followers to an account on Fanvue — a competing platform to OnlyFans that specializes in AI-generated models.
There, followers were invited to pay monthly subscriptions to see “exclusive” and more intimate content featuring the soldier. According to disinformation experts at Boston University, this is a classic case of “AI slop”: mass-generated content with no basis in reality that uses divisive topics (such as politics or nationalism) to attract traffic and monetize the attention of unwary users.
Although Foster’s face is computer-generated, the “puppet master” behind her is human. According to the Post, the account is linked to a network of “digital entrepreneurs” who operate multiple AI figures simultaneously.
These creators use tools such as Midjourney and Stable Diffusion to create profiles that appeal to specific niches — from patriotic militarism to fitness — with the aim of channeling traffic to platforms where AI-generated nudity is sold.
The real identity of these individuals remains hidden behind shell companies and digital pseudonyms, taking advantage of the lack of regulation on the identity of synthetic models.
The Problem of Deepfakes
This case is not isolated. Recently, the television network OAN (One America News) was criticized for using AI-generated images of military recruits in real news reports. The danger of these deepfakes, experts say, is that such “personas” can be quickly converted from marketing tools into information warfare tools.
“If you can convince a million people that a fake person is a war hero, you can convince them of almost anything,” warns misinformation expert Joan Donovan. Jessica Foster’s case serves as a warning: on the battlefield of social media, not everything that shines with the colors of the flag is human.