Is this video of Tom Cruise fake or real?
Deepfakes — or computer-generated fake videos — are on the rise, and they can be dangerous...
Category: 🤖 Tech

Transcript
00:00 I'm going to show you some magic, it's the real thing.
00:07 I genuinely love the process of manipulating people online for money.
00:11 President Trump is a total and complete dipshit.
00:14 Moving forward, we need to be more vigilant with what we trust from the internet.
00:30 If humans are already tricked by simple spam emails,
00:33 I think deepfakes are going to be really effective in the future
00:36 if there aren't tools in place against them.
00:45 There was a project called the Video Rewrite Program,
00:48 and that was the first major application of a deepfake.
00:50 You take one person talking in a video track and sync it to a different audio track,
00:54 manipulating their face to make it seem like they're speaking the words from the other audio track.
00:59 I never met Forrest Gump. I'm a little teapot, short and stout.
01:08 The term deepfake comes from "deep learning" mixed with "fake event."
01:11 The fake event is the output that a deepfake produces,
01:15 and deep learning is the process used to develop the algorithms
01:20 that combine the image with the video.
01:23 A deepfake relies on a neural network,
01:25 which takes a large sample of data and finds patterns throughout it.
01:30 So you can take an image and apply it to a video of somebody moving their face.
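To make that pattern-finding concrete: the classic face-swap setup trains one shared encoder (which learns pose and expression) with a separate decoder per identity, then mixes them at swap time. Below is a minimal PyTorch sketch of that idea; every layer size and name is an illustrative assumption, not the code of any tool shown in the video.

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap architecture.
# All names and sizes are illustrative assumptions, not any real tool's code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face from the latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

# One shared encoder learns pose/expression; one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()  # trained to reconstruct person A
decoder_b = Decoder()  # trained to reconstruct person B

# After training, the swap: encode a frame of person A, decode as person B.
frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real video frame
with torch.no_grad():
    fake_b = decoder_b(encoder(frame_a))
print(fake_b.shape)  # torch.Size([1, 3, 64, 64])
```

Because the encoder only ever sees pose and expression, decoding its output with the other identity's decoder produces person B's face making person A's movements, which is the swap described above.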
01:42 What's up, TikTok?
01:48 Every now and then I like to treat myself.
01:50 If Deep Tom Cruise were used for malicious purposes,
01:55 you could use a platform like TikTok to go viral and spread
01:58 pretty dangerous news.
02:07 As humans, if we pull up a video of someone
02:09 and the voice sounds similar enough, I feel like the image of the deepfake
02:13 will trick the ears into thinking,
02:16 oh, hey, that's that person's voice.
02:18 And there are pretty good voice actors who can almost replicate the real thing.
02:39 They train on a ton of data of other humans moving their faces,
02:51 and then they can apply it to a still image.
02:53 But first they have to use algorithms to take that image
02:57 at a low resolution and turn it into a high resolution.
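Here is a rough sketch of that low-to-high resolution step, in the style of SRCNN-like super-resolution networks: upscale naively first, then let a small convolutional network learn to restore the detail. The 4x factor and layer sizes are assumptions for illustration, not any specific app's pipeline.

```python
# Minimal sketch of learned super-resolution for an old, low-res face photo.
# Sizes and scale factor are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySuperRes(nn.Module):
    """Upscale a face crop, then let convolutions sharpen the detail."""
    def __init__(self, scale=4):
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(
            nn.Conv2d(3, 64, 9, padding=4),   # extract coarse features
            nn.ReLU(),
            nn.Conv2d(64, 32, 5, padding=2),  # map to a sharper representation
            nn.ReLU(),
            nn.Conv2d(32, 3, 5, padding=2),   # reconstruct RGB output
        )

    def forward(self, x):
        # Naive bicubic upscale first; after training, the network adds back
        # the fine detail that interpolation alone cannot recover.
        x = F.interpolate(x, scale_factor=self.scale, mode="bicubic",
                          align_corners=False)
        return self.refine(x)

model = TinySuperRes()
old_photo = torch.rand(1, 3, 32, 32)  # stand-in for a low-res scanned face
restored = model(old_photo)
print(restored.shape)  # torch.Size([1, 3, 128, 128])
```

Once the still image is at a high enough resolution, the face-animation model trained on other people's facial motion can be applied to it, which is what makes the old photos move.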
03:00 It's bringing deepfakes to the masses in a way that hasn't really been seen before.
03:04 I think being able to take, for example, an old, deceased family member,
03:08 taking one of their pictures and bringing them back to life,
03:11 could be a really cool experience for a lot of people.
03:23 Oh, my God.
03:38 Take a world leader and make a deepfake out of them.
03:46 Basically, they could say whatever they want.
03:49 They could say things to cause public unrest,
03:53 give dangerous information or dangerous commands to people.
03:58 Family members can be impersonated too,
04:00 so scams can happen that way as well.
04:01 I'd say 10 years ago, when text and images were the biggest thing,
04:05 it wasn't nearly as big of an issue.
04:07 But now I feel like the shorter-form video platforms
04:11 already have a huge misinformation and fake news issue,
04:15 because video is so convincing.
04:17 I'm sure hackers are going to get much more creative about this stuff,
04:21 especially going forward.
04:29 I think big tech is going to band together and focus on tools
04:33 that can help prevent deepfakes, or at least catch them right away,
04:36 and probably provide labels to say, hey, this isn't real,
04:40 especially on Facebook, Instagram, TikTok, any of these apps.
04:45 I think social media companies are the ones that need to be at the forefront
04:48 of all of this, focusing on creating these tools,
04:50 because they're the ones who'd be most heavily impacted, and their users as well.
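For a sense of what such a tool might look like, here is a minimal sketch of a frame-level detector: a pretrained image backbone with a real-vs-fake head, scored per frame and thresholded per clip. The model choice, the thresholds, and the score_frames helper are all hypothetical; a real system would need the head fine-tuned on labeled real and fake footage before the scores meant anything.

```python
# Minimal sketch of a frame-level deepfake detector a platform might run.
# Model choice, thresholds, and helper names are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

# Reuse an ImageNet-pretrained ResNet and replace its head with a single
# real-vs-fake logit. The head must be fine-tuned on labeled real/fake
# frames before its output is meaningful.
detector = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
detector.fc = nn.Linear(detector.fc.in_features, 1)

def score_frames(frames: torch.Tensor) -> torch.Tensor:
    """Return a fake-probability per frame: (N, 3, 224, 224) -> (N,)."""
    detector.eval()
    with torch.no_grad():
        return torch.sigmoid(detector(frames)).squeeze(1)

# A clip gets a warning label if enough of its sampled frames look synthetic.
frames = torch.rand(8, 3, 224, 224)     # stand-in for frames sampled from a clip
probs = score_frames(frames)
if (probs > 0.5).float().mean() > 0.3:  # illustrative per-frame and per-clip thresholds
    print("flag for review / apply warning label")
```

Scoring individual frames and aggregating over the clip is one plausible way a platform could catch deepfakes at upload time and attach the kind of label described above.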