Articles and videos featuring fruits and vegetables in human form, created by generative artificial intelligence, have flooded TikTok and Instagram in recent weeks. These short, addictive mini-series with reality TV vibes are capturing the attention of millions of users every day. Who is really behind these videos that sometimes flirt with pornography, and for what purpose?
Unfaithful apples, bananas flirting with eggplants and giving birth to chocolate squares… In just a few days, Priscille became addicted to this new form of AI-generated storytelling on TikTok. Every evening, this 31-year-old gets caught up in these short videos reminiscent of Pixar animated films, which depict stories of love and betrayal among fruits, vegetables, and other foods like pieces of chocolate, cheese, or pasta.
Like her, millions of viewers have spent recent weeks following these videos of anthropomorphized, often hypersexualized fruits and vegetables. This fascinating, absurd content quickly becomes addictive. Specialized accounts are multiplying at a rapid pace: some, like “OnlyMoviesFr,” have garnered over 58 million views in a week by adapting the concept of popular reality TV shows like L’Île de la Tentation.
“Well-crafted stories, caricatures”
“I can’t stop watching,” says a manager at a discount store chain in Villeurbanne, who subscribed to the channel to watch every episode of the Skibidi Fruits mini-series. “The graphics, the colors… I find it entertaining and funny at the same time, in addition to being very well made,” she comments. The phenomenon is such that internet users beg the account creator to post episodes on time, eager to discover the next adventures.
Charles, a 21-year-old eco-management student in Seine-et-Marne, started creating fruit story videos under the pseudonym “Dr. Einstein” in early March after discovering the concept on English-speaking TikTok. The young man thought, “It shouldn’t be that difficult to do,” and decided to replicate the idea in French. To his surprise, his first video accumulated over 600,000 views in just a few days.

“These are well-crafted stories that are relatable to everyone, caricatures,” admits the student, whose first video told a tale of betrayal between a couple of strawberries and a banana. “The woman gave birth to a banana, creating a whole drama.”
Since then, he has been publishing one or two videos per day and, alongside his studies, manages three TikTok accounts that have accumulated tens of thousands of followers in just fifteen days.
“It’s not very complicated to do: my prompts (the instructions given to the AI tool) are no more than 2-3 lines, but it’s not as quick as one might think either. It still requires time and some practice, because the AI doesn’t produce the video from A to Z,” explains the student, who spends one to two hours of work on each video.
“There is a whole process: you have to find a scenario, a good idea. Most of them I come up with myself. Then it’s an accumulation of small sequences of about ten seconds each, produced with a short-video generator – namely the paid version of the Gemini AI – that I then assemble using the CapCut editing software.”
“Adapting to what the algorithm favors”
However, Charles and Oscar, a fellow student creator, emphasize that the income TikTok provides is highly variable and unreliable over the long term. “What works today will probably not work in two months,” they observe lucidly.
Both students, like many other creators, also sell online courses (in the form of PDFs, ebooks, or video modules) teaching how to produce this content: they detail their methods for around thirty euros. Another easy way to “diversify their income,” according to Oscar.

Behind these initiatives lies a mechanism specific to social networks, where content creation is increasingly driven by virality and profitability.
“A content creator today doesn’t really need to think about what will work: they can quickly test various types of content, flood social networks, and among the thousands of things generated, simply see what grabs the most attention,” explains journalist Victor Fersing, who works for Lève les yeux, an association that fights screen addiction.
According to him, these videos fall into a category known as “AI slop,” low-grade content generated by artificial intelligence. “Other creators identify that it’s becoming a trend and simply copy the mechanics. Since it’s very easy to generate this content, it tends to flood the internet,” he says.
“Visual content is very appealing because it requires minimal effort. It’s dynamic, it stimulates, it’s like intravenous dopamine. With the increasing ease of content creation and the lack of moderation on these platforms, there are virtually no limits, it goes in all directions,” analyzes Florence Sedes, a professor at the University of Toulouse and a researcher in data science and artificial intelligence at IRIT.
“It disintegrates our cognitive abilities”
Victor Fersing adds that the automation of these productions results in often incoherent and superficial stories. “Since humans aren’t really behind the story construction, there’s no real coherence or morality. They’re just optimized content to capture attention,” he continues, adding that they offer far less value than a film or series with a real narrative structure and artistic, aesthetic, or moral stakes.
Often filled with adolescent vocabulary like “tana” or “doro”, these videos aim to capture attention like a “mental virus,” diverting our minds toward trivial content and depriving us of time for “reading, eating, or exercising”. “It’s not called ‘brain rot’ for nothing. It’s so dumbing-down that it disintegrates our cognitive abilities,” he says.
The expert, specializing in screen and algorithm addiction, points out that AI creates versions of reality much more attractive than real life. “It’s eye-catching, it holds our attention, and regardless of how absurd the storylines are, it works,” he said.
He explains that these videos rely on “supernormal stimuli,” exaggerated images and sounds that trigger very strong emotional responses: the animation is extremely dynamic, the characters are very colorful, their bodies hypersexualized, the settings idyllic with seascapes, sunsets, American high school scenes.
“It’s loud and jumps constantly, sometimes flirts with pornography. Typically, we see fruits with model-like bodies or ultra-bodybuilders, exaggerating some parts of the human body to capture attention,” elaborates Victor Fersing.
The risk, adds psychoanalyst Michaël Stora, an expert in digital practices, is “to think that we’re watching something harmless and innocuous when totally nauseating messages sometimes lurk behind it”: insidiously racist, misogynistic, or homophobic ideas. “Not all of them do that, but quite cynically or cunningly, some content creators know it will provoke a reaction and generate engagement,” he says.
Platforms with no interest in stopping the “brain rot”
A strategy that Oscar acknowledges, admitting that he has sometimes sought to generate borderline videos to provoke reactions from internet users. However, he and Charles firmly deny any intention of spreading racist or misogynistic ideas through their videos. “I understand that people may think so with some videos – on accounts other than mine – that portray the woman as the villain, the unfaithful one,” notes Charles.
“You have to look at it with a sense of humor,” counters Oscar, for whom it is “humorous content”. “Personally, I try to limit the vocabulary and not target any particular community, so as not to annoy or offend people too much, even though I admit it is often very cliché.”
Victor Fersing does not rule out that some of these videos conceal political messages. “A large part probably do it solely for monetization, but given the sheer volume, it is entirely possible that some use this medium to promote ideas,” he says. “What is the proportion between the two? It’s very difficult to quantify.”
Ultimately, this type of video feeds the economic race between platforms: if TikTok or Instagram were to crack down hard on “brain rot”, it would automatically lose market share to its competitor, as addicted users would simply switch. In essence, no platform has an interest in regulating this content too strictly, because “addiction is a very good economic model,” according to Victor Fersing.
“It’s not the most sophisticated intellectual nourishment in history,” he concludes, noting that the digital space remains largely unregulated, an informational wild west where a minor can easily be exposed to this type of content without having the tools to resist it.