At a Senate Commerce Committee hearing last week, Sen. Amy Klobuchar (D-MN) spoke about deepfakes.

Transcript
00:00Senator Fetterman. Senator Klobuchar. Thank you. Good thoughts, Senator Fetterman. Thank you.
00:04So you guys have been sitting here so long that the Pope has been chosen. Wow.
00:11Well, we don't know who. Congratulations, Amy. The white smoke has come up. Congratulations. What?
00:17Oh, stop. You're welcome. Probably wouldn't work. But in any case, I left for some other
00:25things and came back because I had one more question that I wanted to ask, and it's related to the
00:31whole deepfake issue, because Senator Blackburn and Senator Coons and Senator Tillis and I worked on
00:38this really hard, and Blackburn and Coons are in the lead of the bill. But we have recently seen
00:45deepfake videos of Al Roker promoting a cure for high blood pressure, a deepfake of Brad Pitt asking
00:52for money from a hospital bed. Sony Music has worked with platforms to remove more than 75,000
00:58songs with unauthorized deepfakes, including voices of Harry Styles and Beyoncé. It's not
01:06just famous people. I recently met a Grammy-nominated artist from Minnesota and talked to him about
01:16what's going on with digital replicas. So there's a real concern, and it kind of gets at what Senator
01:21Schatz and I were talking about earlier with the news bill. But I just wanted to make you all aware
01:27of this legislation because there were some differences on this, and now we have gotten a
01:32coalition, including YouTube, supporting it, as well as the Recording Industry Association, Motion Picture
01:41Association, SAG-AFTRA. So it's a big deal, and I'm hoping it's something that you will all look at.
01:49But could you just comment, I would go to you, Mr. Smith, first, about protecting people from having
01:55their likenesses replicated through AI without permission. And even if you all pledge to do it,
02:01our obvious concern is that there may be other companies that wouldn't. And that's why I think
02:07as we look at what these guardrails are, the protection of digital, people's digital rights
02:13should be part of this, Mr. Smith. Yeah, no, I think you're right to point to it. It has become a
02:18growing area of concern. During the presidential election last year, both campaigns, both political
02:27parties were concerned about the potential for deepfakes to be created. We worked with both campaigns
02:34and both parties to address that. We see it being used in ways that I would call abusive,
02:42you know, including of celebrities and the like. I think it starts with an ability to identify when
02:48something has been created by AI and is not a genuine, say, photographic or video image.
02:56And we do find that AI is much more capable of doing that than, say, the human eye and human
03:02judgment. I think it's right that there be certain guardrails and some of these we can apply
03:08voluntarily. We've been doing that across the industry. OpenAI and Microsoft were both part of
03:13that last year. And there are certain uses that probably should be considered across the line and
03:20therefore should be unlawful. And I think that's where the kinds of initiatives that you're describing
03:25have a particularly important role to play. And could you look at that legislation?
03:30Absolutely. Appreciate it. Mr. Altman, just same question, same thing.
03:36Sorry. Of course, we'd be happy to look at the legislation. I think this is a big issue and it's
03:41one coming quickly. I think there are a few areas to attack it. You can talk about
03:47AI that generates content, platforms that distribute it, how takedowns work, how we educate society and how
03:53we build in robustness to expect this is going to happen. I do not believe it will be possible to
03:58stop the generation of the content. I think open source, open weight models are a great thing on the
04:03whole and something we need to pursue. But it does mean that there's going to be just a lot of these
04:08models floating around that can do this. The mass distribution, I think it's possible to put some
04:13more guardrails in place and that seems important. But I don't want to neglect the sort of societal
04:21education piece. I think with every new technology, there's some sort of, almost always some sort of
04:27new scams that come. The sooner we can get people to understand these, be on the lookout for them,
04:32talk about this as a thing that's coming, I think the better.
04:36People are very quickly understanding that content can be AI generated and building new kinds of
04:43defenses in their own minds about it. But still, you know, if you get a call and it sounds exactly
04:47like someone you know and they're panicked and they need help, or if you see a video that was like
04:51the videos you talked about, this like gets at us in a very deep psychological way. And I think we need
04:57to build societal resilience because this is coming. It's coming, but there's got to be some
05:05ways to protect people's privacy rights and you've got to have some way to either enforce it,
05:12damages, whatever, or there's just not going to be any consequences. Absolutely, we should have all
05:16of that. Bad actors still don't always follow the laws and so I think we need an additional shield
05:20wherever we can have them. But yes, we should absolutely protect that. All right, look forward
05:24to working with you on it. Thank you. So I have to say, Senator Klobuchar's question about
05:31fakes and AI fakes made me feel guilty because I did in fact tweet out an AI generated picture
05:40of Senator Fetterman as the Pope of Greenland. So I am guilty of doing so, although it may not be a
05:46fake, it may be a real... Hey, oh, oh, oh, parody is allowed under the law. Parody is allowed. That
05:53is different than what I'm talking about, but Senator Fetterman should respond.
05:59Yeah, I didn't know. It may be real. It's a good shot, actually.
