Khaby Lame is the influencer who turned the simplest of gestures (opening his hands as if to say "it's obvious, right?") into a global language, becoming the most-followed creator on TikTok. Today, the pantomime that made him famous risks becoming a replicable industrial asset. According to several sources, Lame has reportedly sold the company behind his business, Step Distinctive Limited, for close to one billion dollars (975 million), in a deal that would include the creation of a digital twin powered by artificial intelligence.
The truly disruptive element of the move is not its economic value but the paradigm shift: it is no longer just the image that is monetized; presence itself is automated. A "digital twin", in this context, is no longer a fanciful avatar but an operational copy that can produce content, adapt to languages and formats, manage channels and strike partnerships without temporal or physical constraints.
The entertainment industry already knows this trajectory; the difference is that it is now descending from cinema into the creator economy, with greater speed and permeability. As early as 2022, James Earl Jones authorized the use of archive recordings to recreate the voice of Darth Vader with AI in the series Obi-Wan Kenobi, with the support of Skywalker Sound and the company Respeecher.
The point, as always, is not to demonize the technology: in many cases its use is consensual and even makes narrative sense. The problem is understanding what happens when identity becomes a product that "works" in your place and, above all, when it can keep doing so even after you have changed your mind. If the digital twin is designed to maximize attention and conversions, it will tend to harden style and personality into a formula, rewarding whatever performs best and penalizing the author's evolution.

Not by chance, in Hollywood the discussion has shifted onto contractual ground. SAG-AFTRA has published guidelines and informational materials insisting on informed consent and compensation whenever a "digital replica" of a performer is created, including cases of scanning and subsequent use.
These protections describe a reality: the digital copy is not a mere special effect but a potential substitute for human labor. And the creator economy, which lives on continuous presence, is even more exposed terrain: a clone can produce content "like you" without your timing, your limits, your choices. Concrete dilemmas follow: who is responsible if the twin associates Lame's face with messages, brands or causes the original would never have endorsed? Who can switch it off? And under what economic conditions?
On the regulatory level, Europe is trying to reduce the ambiguity by imposing transparency obligations: the AI Act requires, among other things, that synthetic content (deepfakes included) be labeled as such and that, in certain cases, users be informed when they are interacting with an AI system. In parallel, the EU Commission has been working on a code of practice on the marking and labeling of AI-generated or AI-manipulated content, precisely to make recognizable what looks real but is not. The problem is that the digital twin of a celebrity does not necessarily present itself as "fake": it can be sold as a legitimate extension of the brand, and thus slip into a grey zone where the user is not strictly deceived, but not really put in a position to distinguish either.
The Khaby Lame case thus arrives at a moment when digital identity is being contested by three forces: platforms that want to automate it, brands that want to scale it, and institutions trying to catch up with it. If the person becomes infrastructure, the question for everyone (creators, artists, performers) is how much of oneself can be granted to a system that never sleeps, and whether omnipresence is worth the price of losing control over one's public "self".
Alessandro Mancini