This year, TikTok has been lit. One viral highlight has been the series of videos featuring style icon Paris Hilton alongside DeepTomCruise, an AI replica of the beefcake actor whose account has over 3.7 million followers on TikTok.
In one clip, he sings Elton John's "Tiny Dancer" to her; another shows them eating cereal together in her kitchen: all seemingly normal moments for a not-so-normal pairing.
In a third video, which has garnered over 1.8 million views, they get ready to attend a gala in a walk-in closet, where she asks him: “Do you think people will really believe we’re a couple?”
“I think people will believe anything,” says DeepTomCruise.
“Story of my life,” Paris replies.
It's convincing, especially considering it isn't really Tom Cruise but a viral TikTok account created by Miles Fisher, one that highlights deepfake technology's power to deceive. Today, it is arguably the most popular deepfake on the internet.
“The goal of every DeepTomVideo is to elicit creative joy and inspire the public imagination,” said Fisher. “Each video is a little jewel box of art with many layers of complexity in scene, dialogue, character, and composition. I don’t define something as viral when everyone in the world watches it. I think something goes viral when the people who watch it end up watching it over and over again.”
The project is the brainchild of Chris Ume, the VFX and AI artist who created the deepfake technology, alongside Fisher, an actor and singer. Ume co-founded Metaphysic with Tom Graham, where they're pushing the boundaries of AI and hyperreal, decentralized tech in pop culture, entertainment, and fashion.
The tech firm is no wallflower: it stepped squarely into the limelight on America's Got Talent, transforming the show's co-hosts, Howie Mandel, Terry Crews, and Simon Cowell, into deepfake singers in front of a live studio audience and showing how AI can transform the entertainment industry. (They also brought an AI Elvis to the virtual stage, in a YouTube video that has racked up over two million views.)
Metaphysic was recently featured at Hilton's first 11:11 Media conference on November 11, a showcase of tech talent linked to the media content company she co-founded with Bruce Gersh. Here, Ume and Graham dish on the future of AI and fashion, styling your own hyperreal avatar, and their upcoming project, Every Anyone.
Forbes: How did this viral set of TikTok videos with Paris Hilton come about?
Thomas Graham: It was a collaboration between Chris Ume and Miles Fisher, who are connected to Paris Hilton and her husband Carter Reum.
We thought, through that friendship, why not do something together? It became something big on her socials; she always puts out a lot of content and a lot of videos, and Miles Fisher as DeepTomCruise always does well on TikTok.
So, it was just really a fun collaboration and a few more videos are on the way, too.
What’s great about Paris Hilton’s online audience?
Graham: Paris has such a lively audience; it's a good cross section of people who are active, diverse, fun, and totally out there. The world she and DeepTomCruise have created together is almost outrageous; the videos are built on the fantastical, make-believe idea of a younger Tom Cruise. When you think about it, Paris has a larger-than-life, magical-realism lifestyle. That's what it is: magical realism.
Chris Ume: It's something they've never seen before; it's not real. Seeing a deepfake version of Tom Cruise doing silly things with Paris Hilton is what keeps people watching. For people who had a crush on Tom Cruise in the 1990s, it transports you back to that place. Everything was better decades ago; that's how nostalgia works. It's a powerful tool for engaging empathy and triggering emotions through hyperreal AI content.
What was it like working with America’s Got Talent, turning the judges into AI singers?
Ume: It was a huge challenge, but we worked hard to pull it off and do our best. In the semifinals of the TV show, you can see how we opened a few new ways technology can be used within the entertainment industry; we even brought Elvis Presley to the stage through deepfake technology. As a creative, what I think is so great is that in a few years it will be possible to have a whole video clip where you're talking with a deepfake Tom Cruise, or whoever; that's how quickly AI is evolving. Having Elvis Presley on the virtual stage is something we can make accessible.
On that note, how else can AI influence how we experience fashion and pop culture?
Graham: It's not about having avatars like cartoony dragons but an actual, realistic avatar that represents you. You could be in Paris Hilton's Paris World in Roblox as who you are as a person. Basing it in reality makes it an extension of reality, rather than an online version that's easy to pick up and throw away. There are hyperreal metaverses that use 3D VR, but we're three to five years away from that becoming popular because of the bandwidth it requires. Right now, we're focused on bringing regular people into virtual worlds where they can see a singer who has passed away and render a 30-second clip, like an Elvis concert.
How can AI affect fashion in the metaverse?
Graham: Fashion designers put clothes and fashion into the metaverse, and they're usually worn by models. What we're doing is putting fashion onto real-world people online. That connection with real people is what matters most to me. Yes, you can go into a Gucci store in the metaverse, but what matters is how you, as an individual, look when you try those Gucci clothes on. That feeling is missing from the metaverse, and from fashion initiatives in the metaverse generally. If you're putting a Gucci dress onto an online avatar, it doesn't necessarily connect to who you are as a person the way fashion does in the real world. That's what we can do: bring real people into content that is safe and ethical, that aligns with who they are, and that gives them the chance to imagine something more in line with their daily lives. Try on a Gucci dress and go to the virtual Met Gala. Why not?
How are you working with celebrities to help control their hyperreal avatars online?
Graham: We're trying to find interesting ways to empower people to own and control their own image and data, and we're helping high-level celebrities do the same, training talent to control their own image. Nobody else can make hyperreal content like we can; we're the only ones dealing with the implications of the question: "What does it mean when there's a perfect digital version of you online that can write or say anything in any language in the world?" Ethics are built into that DNA, and being in control of who you are in the metaverse is core to what we do.
How is your current project, Every Anyone, going?
Graham: Next up, we want to create hyperreal synthetic media for everyone on the internet. We're trying to find an ethical, safe way for regular people to bring their own data into content that is magically real and fully synthetic, content that has no chance of happening in the real world. Maybe you could relive an experience, like having breakfast with your grandparents who passed away. The possibilities are endless.