Is this video of Will Smith in Kendrick Lamar's new music video fake or real?

Deepfakes are computer-generated fake videos, and they are on the rise. Here's why they can be dangerous ...
Transcript
00:00In a land where hurt people hurt more people, I'm calling it culture.
00:05I genuinely love the process of manipulating people online for money.
00:09President Trump is a total and complete dipshit.
00:24Moving forward, we need to be more vigilant with what we trust from the internet.
00:28If humans are already tricked by, you know, simple spam emails,
00:31I think deepfakes are going to be really effective
00:33in the future if there aren't tools in place against them.
00:44There's a project called the Video Rewrite Program,
00:46and that was the first major application of a deepfake.
00:48And you take one person talking in a video track and sync it to a different audio track
00:52and manipulate their face to make it seem like they're talking from the other audio track.
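The resyncing step described above can be sketched as a toy program. This is not the actual Video Rewrite implementation, which matched phonemes in new audio to mouth images from the original footage; here both tracks are synthetic feature vectors, "mouth images" are just frame indices, and the matching is a simple nearest-neighbour lookup, purely to illustrate the idea of driving old video with new audio.

```python
import numpy as np

# Toy sketch: given a new audio track, reuse mouth shapes from the
# existing video footage so the face appears to speak the new audio.
# All data here is synthetic; real systems match phonemes to
# recorded mouth images rather than random feature vectors.

rng = np.random.default_rng(1)

# Library built from the original video: one audio feature vector per
# frame, each paired with the mouth image shown on that frame.
video_audio_features = rng.normal(size=(50, 8))
mouth_frames = np.arange(50)          # stand-in for mouth images

def resync(new_audio: np.ndarray) -> np.ndarray:
    """For each frame of the new audio track, pick the mouth frame
    whose original audio features are closest (nearest neighbour)."""
    dists = np.linalg.norm(
        new_audio[:, None, :] - video_audio_features[None, :, :], axis=2
    )
    return mouth_frames[np.argmin(dists, axis=1)]

new_track = rng.normal(size=(10, 8))  # the "different audio track"
chosen = resync(new_track)            # one mouth frame per new frame
print(chosen.shape)                   # (10,)
```

Feeding the original audio back in returns the original frame order, which is a quick sanity check that the lookup behaves as intended.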
00:58I'm a little teapot, short and stout.
01:05The term deepfake comes from deep learning mixed with fake events.
01:08So the fake event is the, you know, output of what a deepfake comes up with.
01:12And then deep learning is the process to come up with the algorithms
01:17and everything to, you know, combine the image with the video.
01:20Deepfake relies on a neural network,
01:22which basically takes a large sample of data and finds patterns throughout it.
01:27So you can take an image and apply it to a video of somebody moving their face.
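The pattern-finding step just described can be illustrated with a minimal sketch. Real deepfakes use deep neural networks (for example, shared-encoder autoencoders with a decoder per face); the linear least-squares fit below is only a stand-in for "learn a mapping between two faces from a large sample of paired frames, then apply it to new frames", and all the data is synthetic.

```python
import numpy as np

# Toy sketch: learn the relationship between person A's frames and
# person B's frames from paired samples, then reuse that learned
# mapping to synthesize a B-frame from an unseen A-frame.

rng = np.random.default_rng(0)

# Pretend each row is a compact feature vector for one video frame.
frames_a = rng.normal(size=(200, 16))             # large sample of data
true_map = rng.normal(size=(16, 16))              # hidden relationship
frames_b = frames_a @ true_map + 0.01 * rng.normal(size=(200, 16))

# "Deep learning" stand-in: fit the A-to-B mapping by least squares.
learned_map, *_ = np.linalg.lstsq(frames_a, frames_b, rcond=None)

# Apply the learned pattern to a new, unseen frame of person A.
new_frame_a = rng.normal(size=(1, 16))
fake_frame_b = new_frame_a @ learned_map
print(learned_map.shape)  # (16, 16)
```

With enough paired samples the fitted mapping closely recovers the hidden one, which is the same intuition as a network generalizing from its training set to new frames.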
01:42What's up, TikTok?
01:45Every now and then I like to treat myself.
01:48Deep Tom Cruise, if that was used for malicious purposes,
01:52you know, you could use a platform like TikTok to go viral and spread,
01:55you know, pretty dangerous news.
01:57Let me show you some magic.
02:00It's the real thing.
02:11As a human, if we pull up a video of someone and if the voices sound similar enough,
02:15I feel like the image of the, you know, the deepfake will kind of trick the ears into like,
02:20you know, being like, oh, hey, that's that person's voice.
02:22And, you know, there are pretty good voice actors who can,
02:25who can almost replicate the real thing.
02:28So,
02:51they take and train on a ton of data of other humans moving their faces,
02:55and then they can apply it to a still image.
02:57But what they have to do is they first have to use algorithms to
03:00take that image at a low quality resolution and turn it into a high quality resolution.
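The upscaling step mentioned above can be sketched minimally. Deepfake pipelines use learned super-resolution networks for this; the nearest-neighbour pixel repetition below is only meant to make concrete what "turn a low-quality resolution into a high-quality resolution" means, on a synthetic 4x4 "image".

```python
import numpy as np

# Minimal sketch of upscaling: produce a larger image from a small
# one. Real systems learn to hallucinate plausible detail; repeating
# each pixel, as here, only changes the resolution, not the quality.

def upscale_nearest(image: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2-D grayscale image by repeating each pixel
    `factor` times along both axes."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

low_res = np.arange(16, dtype=float).reshape(4, 4)   # 4x4 "photo"
high_res = upscale_nearest(low_res, 4)               # 16x16 output
print(high_res.shape)  # (16, 16)
```

Each original pixel becomes a 4x4 block in the output, so the top-left block of the result still carries the top-left value of the input.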
03:04It's bringing deepfakes to the masses in a way that hasn't really been seen before.
03:08I think by being able to take like, for example, like an old deceased family member,
03:11taking one of their pictures and kind of, you know,
03:13turning them back into life, that could be a really cool experience for a lot of people.
03:27Oh, my God.
03:47Taking a world leader and making a deepfake out of them.
03:50And basically, they could, you know, say whatever they want.
03:52They could say things to cause public unrest, you know, say like,
03:57I guess, give dangerous information or dangerous, I guess, like commands,
04:01you know, to people, you know, family members can be impersonated.
04:03So, scams can happen that way as well.
04:05I'd say like 10 years ago, when text was the biggest thing, text and images,
04:09it wasn't nearly as big of an issue.
04:11But now, you know, I already feel like the shorter form video platforms
04:15already have a huge like misinformation issue and fake news issue,
04:19because video is so convincing.
04:21I'm sure like hackers are going to get much more creative about this stuff,
04:24especially going forward.
04:33But I think big tech is going to be banding together and focusing on tools
04:37that can help prevent deepfakes or at least catch them right away and probably like
04:41provide labels to say, hey, this like, you know, isn't good,
04:44especially, you know, on Facebook, Instagram, TikTok, any of these apps.
04:48These social media companies are the ones that need to be focusing on creating these tools,
04:52I think at the forefront of all of this,
04:54because they're the ones who'd be most heavily impacted and their users as well.
