Sarah

Startup trek, episode 8: The Battle

Updated: Dec 13, 2018



Season 1, episode 8, "The Battle"


Lesson: simulated audio and video are entering a step change


This post is part of my ongoing quest to watch every episode of Star Trek: The Next Generation and pull one startup, entrepreneurship, tech, or investing lesson from each.


A rogue Ferengi captain with a personal revenge agenda tries to manipulate Picard's memories of a battle in which Picard (rightfully) killed the Ferengi's son in self-defense. The Ferengi captain, Bok, gifts Picard his old ship, the Stargazer, so that he can covertly target Picard with a device that projects thoughts via low-frequency transmissions. While Picard deals with the terrible headache and confusion the device causes, Bok also plants a forged log on the Stargazer in which Picard, in his own voice, appears to describe the battle as a mistake and an unprovoked assault.


Data analyzes both logs, the Stargazer's and Picard's personal one, and finds discrepancies suggesting that one is a forgery. Yet the fake is so convincing that they can't tell which is real. Of course it turns out that the Stargazer log is the fake: the Ferengi are untrustworthy yet again, and Picard did, in fact, save the day during that battle almost a decade earlier.



Falsified audio and video already exist today. As with much of technological development, porn pioneered these techniques. Get somewhere you can do an NSFW search and look up "deepfakes." The term has since broadened to describe any AI-manipulated video in which one person's face is swapped convincingly onto another's body, but it started with celebrities and porn. Throw a celebrity's face onto a porn star's body, and boom, you've got celebrity porn. Some of these videos are extremely hard to identify as fake.



Sure, you could use Hollywood-style heavy-handed CG to manipulate video (like the effects used to add a younger Carrie Fisher as Princess Leia to Rogue One), but it doesn't scale well: it takes millions of dollars and a ton of time. The societal impact stays limited, too, because the money and time involved put it out of reach for most people.


But newer deep learning techniques accomplish virtually the same level of CG fakery in a few hours at almost zero cost. Check out this deepfake version of Leia (bottom) versus the Hollywood version (top). Although the skills are still concentrated among talented CS experts, accessible tools keep pushing what the average person can do with AI. Now you can throw your face onto actors in famous movie scenes, generate speeches in someone's voice, or spin up a video where you're killing it on the dance floor (something that, if you're me, would never happen in reality). Or, on a more dystopian note, make former President Obama talk about the dangers of fake video in politics, as BuzzFeed did here. Weird cultural and legal questions abound.
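
For the curious, here's a rough picture of how the classic face-swap deepfakes work under the hood: a single shared encoder learns a generic face representation, and each person gets their own decoder trained to reconstruct only that person's face. The PyTorch sketch below is my own toy illustration with made-up layer sizes, not the code of any real deepfake tool, but it shows the core swap trick: encode a frame of person A, then decode it with person B's decoder.

# Toy sketch of the shared-encoder / per-identity-decoder idea behind
# early face-swap deepfakes. Layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # shared latent "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()       # one encoder shared by both identities
decoder_a = Decoder()     # trained to reconstruct only person A's faces
decoder_b = Decoder()     # trained to reconstruct only person B's faces

# Training (omitted) reconstructs each person with their own decoder.
# The swap at inference time: encode person A, decode with B's decoder,
# so B's appearance comes out with A's pose and expression.
face_a = torch.rand(1, 3, 64, 64)        # stand-in for a cropped, aligned face
swapped = decoder_b(encoder(face_a))
print(swapped.shape)                     # torch.Size([1, 3, 64, 64])

A real pipeline also needs face detection, alignment, and blending the swapped face back into the original frame, which is where much of the practical effort goes; the point here is just that the core trick is a small, cheap model rather than a Hollywood effects budget.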


We're still in the early days of deepfakes. But given the exponential nature of technology development, deep learning improvements will compound on each other until it's incredibly difficult to tell real video from AI-generated video. "Awesome," you may be thinking. "What dumb but amusing gifs we'll be able to send." But we'll also have VR experiences so convincing that we can't tell real from fake, surveillance footage that's falsified and used as evidence in court, and nation-states weaponizing fabricated speeches in which opponents say inflammatory things. Maybe, as Elon Musk has said, these developments are what ultimately lead to us living in a simulation. He puts it at "a billion to one chance we're living in base reality" today:

"If you assume any rate of improvement at all then games will become indistinguishable from reality. Even if that rate of advancement drops by a thousand from what it is now, let's just imagine it's 10,000 years in the future, which is nothing on the evolutionary scale." - Elon Musk

Fun. But let's not get ahead of ourselves. If TNG is what I think it is, I'll have plenty of time to talk about simulations later. For now, just know that fake audio and video exist and are only getting more widespread.

