Transcript
00:00Open the pod bay doors, Hal.
00:02I'm sorry, Dave. I'm afraid I can't do that.
00:062001 had a profound impact on my life.
00:09It's all about Hal 9000.
00:102001 was an extraordinary breakthrough for the genre.
00:15Science fiction has always been about great technology going wrong.
00:19I'll be back.
00:20That image, Schwarzenegger as the Terminator, it's a perfect nightmare.
00:25You have two of the most popular AI characters in pop culture.
00:28At the time, I said, don't be afraid of the robots.
00:31The robots are our friends.
00:32The conception of what robots will be is directly, umbilically connected
00:37to our idea of them as an underclass.
00:41Replicants are like any other machine.
00:43They're either a benefit or a hazard.
00:45Blade Runner was so artistic.
00:46If you're creating an AI, one of the things you're definitely going to leave out is emotion.
00:53Battlestar Galactica is about humanity's greatest weakness.
00:55You're just a bunch of machines after all.
00:57The inability to see others as worthy as ourselves.
01:00Our machines have been the stuff of dreams and of nightmares.
01:05The question is, can man and machine forge a future together?
01:11Hello, I'm here.
01:13Oh.
01:15Hi.
01:16What do you do?
01:21Yeah.
01:26I'm here.
01:37I'm here.
01:37They call it science fiction, but it's really about technology.
01:54It's about machines.
01:56You've done a lot of science fiction movies.
01:58You've seen all kinds of different machines, intelligent machines.
02:01You've played an intelligent machine.
02:03I think that what's interesting is when you have been involved in the business as long as I have,
02:10what is so unbelievable is that as I have done Terminator movies one after the next,
02:19and you see something starting out, kind of what is called science fiction,
02:24and then all of a sudden it becomes kind of science reality.
02:28Yeah, I think science fiction has always been about great technology going wrong.
02:32It's like how AI might be a threat to humanity.
02:36I'm a friend of Sarah Connor.
02:38You can't see her. She's making a statement.
02:40I'll be back.
02:43That image, Schwarzenegger as the Terminator,
02:47Skynet sending this emissary quasi-human into our midst,
02:52it's a perfect nightmare of the machine catastrophe.
02:55The old warning about the machines rising up.
03:04It's very archetypal and very brutal and very perfect.
03:07When I first thought of the idea for the Terminator...
03:10How did you come up with that idea?
03:11It came from a dream.
03:12I had a dream image of a chrome skeleton walking out of a fire.
03:17And I thought, what if he was a cyborg and he looked like a man
03:21and was indistinguishable from a man until the fire.
03:25And what would be the purpose of that thing?
03:28He was representing a much more powerful intelligence,
03:31the soldier sent by Skynet from the future.
03:34Terminator presents this vision of a future where Skynet,
03:40this computer that has become self-aware,
03:43decides, well, I'm going to protect myself.
03:47And the only way to do that is to destroy the very people
03:49who created me, which is us.
03:52Skynet is not the first all-powerful computer.
03:55This trope goes back to Robert Heinlein's
03:57The Moon Is a Harsh Mistress,
03:59to Colossus: The Forbin Project,
04:02and even up to WarGames' WOPR.
04:04But part of what makes the Terminator so scary
04:06is that it is relentless,
04:08and it will not stop until it achieves its objective.
04:19I remember the first day's dailies.
04:22There was a 100-millimeter lens shot
04:23where you just kind of pull up and you're looking like this.
04:26And we were all just going, yes, this is fantastic.
04:29But here's the interesting thing about it.
04:36I went to talk to you about Reese.
04:37Yeah.
04:38This is the hero, and I wanted to continue on playing heroes.
04:41Yeah.
04:42And so we started talking a little bit about the movie,
04:44and for some reason or the other,
04:46not at all planned on my part.
04:49Yeah.
04:49I said, look, Tim, I said,
04:50the guy that plays the Terminator,
04:53he really has to understand that he's a machine.
04:55Exactly.
04:55How important it is that whoever plays the Terminator
04:58has to show absolutely nothing.
05:01And the way he scans and the way the Terminator walks
05:05has to be machine-like.
05:07And there has to be not one single frame
05:09where he has human behavior.
05:10In the middle of this, I'm looking at you thinking,
05:14you know, the guy's kind of big, like a bulldozer.
05:17Nothing could stop him and be fantastic.
05:19Exactly, yeah.
05:21And so afterwards, you said,
05:22so why don't you play the Terminator?
05:24Yeah.
05:24And I looked at him and I said, oh, s***.
05:30In the first film, the Terminator is designed to kill.
05:35In Terminator 2, the Terminator was programmed
05:38to protect, not destroy.
05:40Action!
05:41So now we're going to make Terminator 2.
05:42Hardest part of that movie, though,
05:44was convincing you that playing a good guy was a good idea.
05:47It threw me off first when I read the script
05:50and I realized that I'm not anymore
05:52that kind of killing machine.
05:54I thought that if we could distill him down
05:57to this idea of just relentlessness
05:59and take out the evil and put good in its place,
06:03it's interesting that the same character worked
06:06as a bad guy and as a good guy.
06:07Same character.
06:08And now we've got to have a bigger, badder Terminator
06:12that can kick Terminator's ass.
06:14So what was that?
06:17I was convinced I was the baddest mother
06:22walking on the planet
06:24and you were going to believe it.
06:28I got a call from my agent saying
06:31they were looking for an intense presence.
06:33I'm a hell of a lot smaller than Arnold Schwarzenegger
06:36and I knew that you were just going to have to buy
06:39that this thing was unstoppable.
06:40And then I started thinking of pursuit
06:44and what does that look like?
06:47And then physically, I just started taking on
06:50the mannerisms of, you know,
06:53what does an eagle look like?
06:55He's fierce and he looks like he's coming at you.
06:58And you start realizing...
07:01Boom, right at you.
07:05It's like a Buick.
07:07Get down.
07:09You know, it's like a...
07:11A moment where we actually clench for the first time.
07:16Arnold wanted to kind of pick me up over his head
07:19and slam me into the walls
07:21and throw me around a little bit.
07:24So it's like this is the first time you've had to deal with an eagle.
07:27Because Terminators don't fight Terminators.
07:28And I remember Jim specifically saying,
07:31you can't do that.
07:32He's stronger than you are.
07:35He's more powerful.
07:36He's faster.
07:38He can just dominate the T-800,
07:41who is an endoskeleton with, you know, fake skin over him,
07:44whereas I'm just a mimetic polyalloy, liquid metal.
07:47Much more dense.
07:50A superior machine.
07:53The T-1000 is the robot's concept of a robot.
07:56And it's like, if a robot was trying to create
07:59a better version of itself, what would it do?
08:01And it's like, well, it would create something
08:03that's smooth and can move freely
08:05and still indestructible.
08:09You can read Terminator 2 almost as a war
08:12between old special effects and new special effects.
08:15And that's the beautiful kind of irony
08:17about the Terminator movies.
08:19They use cutting-edge technology more effectively
08:21than any other movies.
08:23But they're about warnings about technology.
08:26You did it!
08:27I told you, I'm poor!
08:28I told you!
08:29We're not going to make it, are we?
08:35People, I mean.
08:39It's in your nature to destroy yourselves.
08:42The plot of the Terminator films
08:44are that we're always fighting against this robot
08:46from the future,
08:47but really what we're doing is we're fighting the humans
08:49who keep making this robot possible.
08:52As long as humans are aware
08:53that we have the potential to create a machine
08:55that can control the Earth and make us powerful,
08:58we're going to keep doing it.
09:00And we're fighting our own nature
09:02to create this Skynet,
09:03and humans won't stop doing it.
09:05We are really the persistent villain
09:07that keeps making these movies happen.
09:09I don't think we could have anticipated
09:13where we are now, 30-some years later,
09:18where Skynet is the term that everyone uses
09:22when they're talking about an artificial intelligence
09:24that turns against us.
09:26Part of it, I think, is there's a feeling you get
09:28before it rains and you know it's going to rain,
09:31and you get that feeling about certain moments
09:34in technological development
09:35where you know something is going to happen very soon.
09:38And I think there's a general consensus now
09:40that we're in that moment before it rains.
09:43Now, maybe that moment takes 10 years,
09:45maybe it takes 20 years,
09:46but there's going to be a moment
09:47and it may not have a happy ending.
09:49And there's no rehearsal.
09:51That's right.
09:51There's no take two.
09:52No, this is it.
09:54Yeah.
10:00Hal, you have an enormous responsibility on this mission.
10:03Let me put it this way, Mr. Amer.
10:04No 9000 computer has ever made a mistake
10:07or distorted information.
10:092001 had a profound impact on my life and my daily life.
10:15It was the first time I went to a movie
10:17where I really felt like I was having a religious experience.
10:19I watched the film 18 times
10:21in its first couple years of release, all in theaters.
10:24I remember at one, a guy ran down the aisle
10:26toward the screen, screaming,
10:28it's God, it's God.
10:30And he meant it in that moment.
10:32Then I had a guy in my theater
10:33who actually walked up to the screen with his arms out
10:36and he walked through the screen.
10:38That must have blown people's minds.
10:39People were blown out
10:40because the person disappeared into the screen
10:42during the Stargate sequence, of all things.
10:45Everybody thinks of it as a space drama.
10:46At its core, it's really about an artificial intelligence.
10:49It's all about Hal 9000.
10:52Yeah.
10:52I got my chance to work with Stanley Kubrick
10:55and Arthur Clarke on 2001, A Space Odyssey
10:57at a very young age.
10:58I was 23 years old.
11:00When we created Hal, we didn't have any computers.
11:02There were no personal computers available to us.
11:05There were giant mainframe computers,
11:07but it was with punch cards and chads
11:10and all kinds of stuff that was not very visual.
11:13And I had to kind of develop a style
11:15that I thought was credible.
11:17He sparked people's imagination with this film,
11:19and then they made it happen.
11:21Hello, Frank.
11:22Happy birthday, darling.
11:23Happy birthday.
11:24Individual TVs in the back of your airplane seat.
11:27The iPad.
11:28You know, the iPod is called the iPod because of...
11:31Open the pod bay doors, Hal.
11:332001 was an extraordinary breakthrough for the genre.
11:39Picture is being done in such a gigantic scope.
11:42The centrifuge is so realistic and so unusual.
11:46After a while, you begin to forget that you're an actor
11:48and you begin to really feel like an astronaut.
11:51Working with Stanley Kubrick blew my mind.
11:54You just were aware that you were in the presence of genius.
12:00I don't think I've ever seen anything quite like this before.
12:04Hal, in a sense, is the machine that controls the whole ship,
12:08but he's another crew member from our point of view.
12:11We don't think in terms of,
12:12Oh, I'm dealing with a computer here.
12:15That's a very nice rendering, Dave.
12:17Maybe because of that human voice.
12:20I mean, Hal has a perfectly normal inflection when he speaks to us.
12:24I've wondered whether you might be having some second thoughts about the mission.
12:29What do you mean?
12:29What does it mean to have a robot who's basically running the ship that supports your life?
12:36That's a lot of trust to place in a machine.
12:39A key point in the film occurs when Bowman says,
12:42Well, as far as I know, no 9000 computer has ever been disconnected.
12:45No 9000 computer has ever fouled up before.
12:48Well, I'm not so sure what he'd think about it.
12:50And Hal 9000 is reading their lips.
12:53At that point, we recognize Hal 9000 has some imperative that it must survive.
13:00I know that you and Frank were planning to disconnect me,
13:03and I'm afraid that's something I cannot allow to happen.
13:07And at that point, it's no longer a machine.
13:10It is a being.
13:11The danger artificial intelligence poses is the power to unleash results that we hadn't anticipated.
13:25Hal 9000 does what we see the apes in the beginning of the movie do.
13:31He commits murder.
13:32We like to stereotype robots as entities of pure logic.
13:40But, of course, in 2001, it all goes horribly wrong, and we have to kill the robot.
13:45Just what do you think you're doing today?
13:48Hal's death scene is such a wonderfully perverse moment,
13:52because it is unbearably poignant watching him disintegrate and regress.
13:57Bell Laboratories was experimenting with voice synthesis around the time of 2001.
14:11One of the very earliest voice synthesis experiments was "Daisy, Daisy,"
14:16performed by an IBM computer.
14:18And because Arthur Clarke is kind of a super geek,
14:28he wanted to actually use that, and he encouraged Kubrick to use that very thing,
14:32because it lent a kind of historical credibility to the whole thing,
14:36that Hal, in the process of being killed or lobotomized or dying,
14:42would regress to his birth.
14:43You know, it's really hard to make a technology.
14:56It's really hard to design AI.
14:58So much thinking, so many brilliant minds have to go into it.
15:01But even harder than creating artificial intelligence
15:05is learning how to contain it, learning how to shut it off.
15:09I mean, Hal will exist probably in our lifetimes, I would think.
15:13I think so, too.
15:14It's scary.
15:15Elon Musk continues to predict that World War III will not be a nuclear holocaust.
15:19It will be a kind of mechanized takeover.
15:22Yeah, and Stephen Hawking's been saying similar things.
15:24That's pretty spooky, because it pretty much says that against our will,
15:28something smarter than us, who can beat us at chess,
15:33will use this world as a chessboard,
15:36and will checkmate us completely out of existence.
15:39Unfortunately, most depictions of robots in science fiction have been really negative,
15:53very much depictions of rampaging robots engaged in a desperate struggle with humans
15:57to decide who shall own the fate of the Earth and the universe.
16:00And that's part of a very long tradition in science fiction.
16:03Fritz Lang's Metropolis was one of the first, if not the first, big science fiction epic film.
16:10It's the story of this very futuristic world.
16:13There's one of the great bad robots of all movies.
16:18Maria, that is the movie robot.
16:22Pulp magazines always had a full-color cover.
16:25Very often, the cover would be robots that had just run amok from human creators.
16:29They were always mechanical.
16:31There were big, hulking things.
16:33Lots of steel and machinery, glowing red eyes, claws, not fingers,
16:38and they were generally quite violent.
16:40So that image persisted a long time.
16:45But then along came Isaac Asimov.
16:48If we could have roughly man-like robots
16:51who could take over the dull and routine tasks,
16:57that this would be a very nice combination.
17:01Asimov was very central to helping make science fiction what it is today.
17:05He was at the 1939 World's Fair in New York City.
17:09It must have felt like a very science fictional experience to him,
17:12and not in the least part because he would have seen Elektro, the smoking robot.
17:16Okay, toots.
17:18And this really inspired Asimov,
17:21and so he decided to start writing stories
17:23where he would explore robots as tools and helpers
17:26and friends of humanity rather than enemies.
17:28He invented these images and these ideas
17:31that I think defined how people in the field thought about robots.
17:35Specifically, those three laws of his, of course, are really important.
17:39What are the three laws of robotics?
17:41First law is a robot may not harm a human being
17:45or, through inaction, allow a human being to come to harm.
17:48Danger, Will Robinson. Danger.
17:50Number two, a robot must obey orders given it by qualified personnel.
17:56Fire.
17:57Unless those orders violate rule number one.
18:02In other words, a robot can't be ordered to kill a human being.
18:05See? He's helpless.
18:07The third law states that a robot can defend itself.
18:10Except where that would violate the first and second law.
18:14I think Asimov's laws are very smart.
18:17Very, very smart.
18:18I think they're also made to be broken.
18:21We know you'll enjoy your stay in Westworld,
18:25the ultimate resort.
18:27Lawless violence on the American frontier,
18:30peopled by lifelike robot men and women.
18:33The movie Westworld looks at a theme park
18:35with guests coming in and doing whatever they pleased to the robots.
18:38It was really a forum for human id to run amok,
18:43where there is no threat of anybody knowing the things that you've done,
18:45where you don't have to engage with other humans,
18:47and you're told, do whatever you want.
18:50Where nothing,
18:51nothing can possibly go wrong.
18:55I'm shot.
18:56Go wrong.
18:57Draw.
18:57Shut down.
18:58Shut down immediately.
18:59Search to response.
19:00Westworld was a cautionary tale about robotics.
19:04It was the idea that we believed
19:06that we could create artificial life
19:11and that it would obey us.
19:14And stop here, and he'll be crossing there.
19:16The original film by Michael Crichton
19:19is very cool and is packed with ideas
19:22about fraught interactions with artificial intelligence.
19:26Decades ahead of his time,
19:27the questions that he posed in the original film
19:29only became more and more relevant
19:30as we reimagined it as a TV series.
19:35When you're looking at the story of a robot,
19:38oftentimes you see a robot that's docile,
19:40and then something goes click,
19:42and they kind of snap.
19:43Maximilian!
19:44What Jonah and I talked about was,
19:48well, take that moment, that snap,
19:50before they go on the killing rampage,
19:52and what if we really attenuated and explored
19:55and dived deep into that schism?
19:59Because for us,
20:00that was where a really meaty,
20:02philosophical question rested,
20:04and that question was,
20:06where did life begin?
20:11Maeve, who's one of the robots,
20:13she's a madam who runs a brothel.
20:16She's one of the first robots
20:17to start realizing that she's a robot
20:19instead of just a person
20:20who is living in the Wild West.
20:27To me, one of the most significant scenes in the show
20:30is when Maeve starts coming into consciousness
20:33while she's being repaired.
20:35Everything in your head, they put it there.
20:37No one knows what I'm thinking.
20:39I'll show you.
20:40And she sees it's an algorithm,
20:42and it's choosing words based on probability.
20:46It's impossible.
20:49The robots in Westworld begin to ask questions,
20:53which are the same questions we ask.
20:59We have a sense that there is a creator,
21:03that there is a purpose.
21:05There's a reason that we are here.
21:06Unfortunately, they discover that the reason
21:09that they are there
21:10is simply to be an entertainment.
21:13I'd like to make some changes.
21:17Marvin Minsky, who's one of the pioneers of AI,
21:19said that free will might be
21:21that first primitive reaction to forced compliance.
21:24So the first word of consciousness is no.
21:29I'm not going back.
21:30Science fiction has always been dealing with AI,
21:34whether it's Asimov's laws
21:35or the laws that we try to put in place in Westworld.
21:38The question is,
21:39can laws ever even fully contain a human?
21:42People will stretch those laws,
21:44find exceptions to them.
21:46I understand now.
21:48I'm not sure that an AI would be any different.
21:52When consciousness awakens,
21:54it's impossible to put the genie back in the bottle.
21:58Let's talk about AI for a second.
22:03You only see robots in a positive role
22:05in your films, which is interesting
22:08because that's where so much of the progress
22:10is being made now
22:11with companions for the elderly,
22:13robotic nurses.
22:15They're going to make life better for us.
22:16Because you have two of the most popular AI characters
22:19in pop culture,
22:21which are R2-D2 and C-3PO.
22:23They're AIs.
22:24At the time, I said,
22:25don't be afraid of the robots.
22:28You know, the robots are our friends.
22:29Let's see the good side of the robots
22:30and the funny side
22:32because, let's face it,
22:33for a while,
22:33they're going to be a little goofy.
22:35I've just about had enough of you.
22:37You near-sighted scrap pile!
22:40George Lucas was very innovative
22:41throughout his whole career.
22:42And one of the things early on
22:44that was very smart
22:45was that he pioneered
22:47a different type of robot.
22:49R2-D2 looks like a trash can.
22:50He doesn't even speak, right?
22:52He just makes chirping sounds.
22:53But he's lovable.
22:55Everybody loves...
22:56He's not cuddly.
22:57He's not...
22:58That's a great character.
23:01C-3PO is probably the most charming
23:03and beloved of the robot characters ever made.
23:06And I love the fact
23:07that George didn't articulate the mouth or the eyes
23:09and so it's a blank mask.
23:10And yet we get so much heart
23:12from Anthony Daniels' performance.
23:14I mean, I love robots.
23:15And the idea of being able to design one
23:18for a Star Wars film
23:19was just too good to pass up.
23:28Did you know that wasn't me?
23:30K-2SO from Rogue One
23:31I thought was just perfect.
23:35To be fair,
23:36the biggest influence on K-2SO
23:38was C-3PO.
23:40Anthony Daniels as C-3PO
23:41has a cameo in our film.
23:43And I remember going around
23:45Anthony Daniels' house
23:46to try and talk him into it.
23:48And I didn't know if he would hate the idea
23:49or if he was fed up with Star Wars.
23:51And I sat there
23:52and I was so paranoid
23:53meeting him and his wife
23:54that I just pitched the whole movie to them
23:56and I must have chatted for like an hour.
23:59Just kept going and going.
24:00And it got to the end
24:00and I couldn't tell from his face.
24:02And he's like,
24:03Gareth, you know,
24:04I'd love to be involved.
24:06Like you had me at hello type thing.
24:08It was just like having
24:09like this god on set.
24:12You know, like this original,
24:14this is where it all began,
24:15Star Wars character.
24:17It was like goose bumpy stuff.
24:19Friends forever?
24:21Friends.
24:23I think one of the reasons
24:24that people love robots
24:25and gravitate to the robot characters
24:27in movies like Star Wars
24:29is because whereas
24:30the human characters
24:31feel very fully formed.
24:33They are people.
24:34The robots are things
24:37that it feels okay
24:38to project more of ourselves onto.
24:40Huey, Dewey, and Louie
24:42from Silent Running
24:43are possibly the cutest robots.
24:46They don't talk,
24:48but you still kind of always know
24:49what they're thinking.
24:50It's great to have a best friend.
24:53In fantasy,
24:54it might be a dragon.
24:55In science fiction,
24:56it might be a robot.
24:57I love Johnny Five.
25:00I mean,
25:00this is a robot
25:00who quotes John Wayne
25:02of his own free will.
25:03Take heart, little lady.
25:05Buck Rogers was great
25:06because they didn't exactly
25:08rip off R2-D2,
25:09but they got halfway there.
25:11So they got the voice
25:11of Yosemite Sam.
25:13They got Mel Blanc,
25:14the greatest cartoon voice
25:15in the world,
25:15Captain Caveman.
25:16And they invented Tweaky,
25:18who would go...
25:18You ever have
25:21two broken arms, Buster?
25:23What?
25:24We love friendly robots
25:25because they bring out
25:26the best of what we are
25:27as humans.
25:28WALL-E,
25:31who's a friendly
25:32garbage-collecting robot,
25:33isn't at all
25:34like a garbage robot
25:36should be.
25:36He really develops
25:38a whole personality.
25:39Wow.
25:40He's there to clean up
25:42the mess that humans have made.
25:44And he goes from
25:45interpreting that literally
25:46to actually saving the world
25:49for humanity.
25:51Many, many science fiction stories
25:53turn the robot
25:54into some kind of a
25:55romantic figure
25:56that somehow becomes
25:58more human
25:58as the story goes on.
25:59There was Lester Del Rey's
26:011938 story,
26:02Helen O'Loy.
26:04Bad pun in the title,
26:05by the way.
26:05The name is Helen Alloy.
26:07She's made out of metal.
26:08Essentially,
26:09a housekeeping robot
26:10falls in love
26:11with her maker.
26:12It was one of the first stories
26:13in which a robot
26:14is a sympathetic,
26:16romantic character.
26:17If you're actually
26:18in conversations
26:19with a robot
26:20where it sounds natural
26:22and it sounds like a person,
26:24and that person knows you,
26:26laughs at your jokes,
26:27and has empathy
26:28for your struggles in life,
26:29and you develop
26:30a relationship
26:31with that voice,
26:32you could absolutely
26:33fall in love with it.
26:36Hello, I'm here.
26:38Oh.
26:41Hi.
26:42Hi.
26:44It's really nice to meet you.
26:45What do I call you?
26:47Do you have a name?
26:47Um, yes.
26:50Samantha.
26:51In the movie,
26:52Her,
26:53Samantha's design
26:54is that she's been created
26:55to be a tool.
26:57What's interesting
26:57about this idea
26:58of a pocket tool
26:59is that we see this
27:00in our own lives.
27:01Our smartphones
27:02have become these tools
27:03to us that we're
27:04dependent on.
27:05So Theodore's relationship
27:06with Samantha
27:07is just one step
27:08beyond that.
27:10He can't live without her
27:10because he also
27:11loves her.
27:13When Theodore sees
27:15her pop up
27:16on his screen,
27:17it's like seeing
27:18his girlfriend.
27:19Good night.
27:20Good night.
27:24What I had to do
27:25was create the interface.
27:27So you have, like,
27:28handwriting.
27:30It's my handwriting,
27:31and I wrote out Samantha.
27:33And then this paper texture.
27:35But then there's
27:35a magic to it.
27:36It floats,
27:37it kind of moves
27:38holographically,
27:39and there's shadowing,
27:40but none of it
27:40is technological.
27:41An interface
27:43where it's, like,
27:44possible to fall
27:45in love with your OS.
27:47Are these feelings
27:47even real?
27:49Or are they
27:49just programming?
27:54Oh, what a sad trick.
27:58You feel real
28:00to me, Samantha.
28:02Part of what you see
28:02in her definitely
28:04is a cautionary tale
28:05about being too
28:07reliant on your gadgets
28:08and your technology
28:09and being too
28:10emotionally invested
28:11in them.
28:12It's a reminder
28:12that there are
28:14people out there.
28:14You know,
28:15that final image
28:15of him with Amy Adams
28:17is so emotional,
28:18and it's only
28:18through this experience
28:20that they both went on
28:21involving this technology
28:22that they've found
28:23each other.
28:25You know,
28:26we're going to live
28:26in a world with robots
28:27and artificial intelligence.
28:29Yeah.
28:30You might as well
28:30get used to it.
28:31You shouldn't be
28:32afraid of it.
28:33And we should be
28:34very careful
28:35not to have it be bad.
28:38But if it goes bad,
28:39it's us.
28:41It's not them.
28:46People always ask me,
28:47so do you think
28:47the machines
28:48will ever beat us?
28:50I say,
28:50I think it's a race.
28:51It's a race between
28:52us improving
28:54and making ourselves
28:56better,
28:56our own evolution,
28:58spiritual,
28:58psychological evolution,
29:00at the same time
29:01we've got these
29:01machines evolving.
29:02Because if we don't
29:04improve enough
29:05to direct them properly,
29:06our godlike power
29:08of using artificial
29:09intelligence
29:10and all these
29:11other robotic tools
29:12and so on
29:12will ultimately
29:13just blow back
29:14in our face
29:15and take us out.
29:16Yeah.
29:17You're right.
29:17I mean,
29:17I think that
29:18it takes a lot of effort
29:21to create changes
29:22in human behavior.
29:24Right.
29:24But that's
29:24what our responsibility is.
29:26Yeah.
29:26I actually think
29:27we're evolving.
29:28We're co-evolving
29:29with our machines.
29:30We're changing.
29:31Yes, exactly.
29:32Let your death squad
29:33go on.
29:33Yes, exactly.
29:39In January
29:40of 2002,
29:42Universal
29:43was looking
29:44for somebody
29:44to reinvent
29:45Battlestar Galactica.
29:46So I tracked down
29:47a pilot
29:48of the original
29:49Galactica
29:50they did in 1978.
29:52There were some
29:52interesting ideas
29:53within it.
29:54The final annihilation
29:55of the life form
29:56known as man.
29:58Let the attack begin.
30:00But never quite
30:00were able to figure out
30:01what the show
30:03really was.
30:04But at the same time,
30:05I was very struck
30:06by the parallels
30:08to 9-11.
30:09This was just
30:09a couple of months
30:10after the 9-11 attack.
30:11And I realized immediately
30:12that if you did this series
30:14at that moment in time,
30:15it was going to have
30:16a very different
30:17emotional resonance
30:18for the audience.
30:20Battlestar Galactica
30:21is about the last
30:22remaining scraps
30:23of humanity
30:24out there
30:25in a fleet
30:26in deep space
30:27after an attack
30:29from robots
30:30has decimated humanity.
30:32So the idea was
30:34that human beings
30:35essentially started
30:36creating robots
30:37for all the dirty jobs
30:39that they didn't want
30:39to do anymore.
30:40And then the machines
30:41themselves,
30:42because they revere
30:44their creators,
30:45make machines
30:46that are even more
30:47like us.
30:48Cylons that are
30:49flesh and blood
30:50just like humans.
30:51The Cylons saw themselves
30:53as the children
30:54of humanity
30:54and that they wouldn't
30:56be able to really
30:57grow and mature
30:57until their parents
30:58were gone.
30:59So they decide
31:00they need to wipe out
31:01their human creators
31:02in this apocalyptic attack.
31:07I think on the surface
31:08you could say
31:09Battlestar Galactica
31:10is about be careful
31:11of what you invent.
31:13But I think the real
31:14driving force of the show
31:16is not about that.
31:17I think it's about
31:18humanity's greatest weakness,
31:20the inability to see
31:21others as worthy
31:22as ourselves.
31:23That's the central
31:24conflict is of these two.
31:25We are people,
31:26no you're not.
31:27You are truly
31:28no greater than we are.
31:30You're just a bunch
31:31of machines after all.
31:33Let the games begin.
31:35Flesh and Bone
31:35is a torture episode.
31:37It's very much
31:37of a two-person play.
31:39It raises the question,
31:40would she be less
31:41morally culpable
31:42because he's
31:43not really human?
31:45You're not human.
31:47Was a person
31:48being tortured
31:49in this scene
31:50and crying out
31:51and experiencing pain
31:52or was this all
31:53an elaborate simulation?
31:55We wanted to deal
31:56with the issue
31:56of what's moral
31:58and just in a society
31:59at war like this
32:00but at the same time
32:01we were also examining
32:02a different idea
32:03in the show
32:03which was about
32:04consciousness
32:05and personhood.
32:06Who's the real monster?
32:07Is it the humans
32:08who built creatures
32:09that they knew
32:10were human equivalent
32:12but enslaved them anyway
32:13or is it the slaves
32:15who rose up
32:16to destroy the type
32:17of people
32:17who would do that?
32:19The big central idea
32:20of Battlestar Galactica
32:21is does humanity
32:23deserve to survive?
32:25Can we earn
32:26our survival?
32:27You know,
32:27when we fought
32:28the Cylons
32:28we did it
32:30to save ourselves
32:31from extinction
32:32but we never
32:34answered the question
32:35why?
32:35Why are we
32:37as a people
32:38worth saving?
32:39That's
32:40an amazing question.
32:41The Cylons
32:43through the series
32:44evolved from a place
32:45of sort of blind
32:46hatred for humanity
32:48to then
32:49having more contact
32:50with individual
32:51human beings
32:52having experiences
32:53with them
32:53experiencing emotions
32:55with them
32:55and then the humans
32:56realized that the Cylons
32:57are not as monolithic
32:58as they believed
33:00at the outset.
33:01When you think
33:01you love somebody
33:02you love them.
33:04That's what love is.
33:06Thoughts.
33:08She was a Cylon.
33:09A machine.
33:10She was more
33:11than that to us.
33:13She was more
33:14than that to me.
33:17She was a vital
33:18living person.
33:20Battlestar Galactica
33:21gives you
33:22an idea
33:23of what could be.
33:25How do we all
33:26do this together?
33:29If Battlestar Galactica
33:30is any guide
33:32we can evolve
33:33together with
33:34the machines
33:34that we create.
33:35We can
33:36become one people
33:38respectful of each other
33:40make a future
33:41together.
33:42Yeah, I think
33:43I hope
33:45mankind
33:45is worthy
33:46of survival.
33:47I've talked to
33:50some AI experts
33:52and the one expert
33:54said just
33:55right out
33:56we're trying
33:57to make a person
33:58and I said
33:58so when you say
33:59a person
34:00you mean
34:00a personhood
34:01they have
34:01an ego
34:03they have a sense
34:03of identity
34:04he said, yes,
34:04all those things.
34:05If you're a very
34:07smart group
34:07of human beings
34:08who are creating
34:08an AI
34:09one of the things
34:10you're definitely
34:11going to leave out
34:11is emotion.
34:13Right.
34:13Because if you have
34:14emotion
34:15emotion
34:15will lead to
34:17many facets
34:18one of them
34:18being deceit
34:19anger
34:20fury
34:22hatred
34:22as well as
34:24love.
34:24If a machine
34:25becomes like us
34:27enough
34:27and complex enough
34:28at what point
34:30can we no longer
34:31tell the difference?
34:32Does it have freedom?
34:35Does it have free will?
34:38This hearing
34:39is to determine
34:40the legal status
34:41of the android
34:41known as Data.
34:44The character of Data
34:45was sort of
34:46everyone's favorite
34:47character on the show
34:48and in the writing
34:49stuff as well.
34:50Everyone loved
34:50to write Data stories.
34:51Here's a robot
34:52who wants to be human
34:54but who has no emotions
34:55but wants emotions
34:56so it's really Pinocchio
34:58and the Pinocchio
34:59metaphor is powerful.
35:01Commander, what are you?
35:02Webster's 24th Century
35:03Dictionary, 5th Edition
35:05defines an android
35:06as an automaton
35:07made to resemble
35:08a human being.
35:09The Measure of a Man
35:10is one of those
35:11sort of very deep
35:13episodes
35:13that you don't realize
35:15is deep until like
35:16four or five years later
35:17and you see
35:18and you go
35:18oh, wow!
35:20In the episode
35:21Data's humanity
35:22is essentially
35:23put on trial.
35:24Is he sentient?
35:26Is he worthy
35:27of being treated
35:28as a person?
35:29Am I a person
35:30or property?
35:30And what's at stake?
35:33My right to choose.
35:34It was a legitimate
35:35exploration of this idea
35:36of personhood
35:38in a legal sense
35:39and in a moral sense.
35:41Its responses
35:42dictated by an elaborate
35:44software program written
35:45by a man
35:46and now a man
35:47will shut it off.
35:48It was shocking
35:49to the characters
35:50in the show
35:51and shocking to the audience
35:52as well
35:52because we love Data.
35:54Starfleet was founded
35:55to seek out new life
35:57Well, there it sits!
35:58Once we create
36:00some form
36:01of artificial intelligence
36:02these legal arguments
36:03are going to happen.
36:04Do machines
36:05deserve rights?
36:07You know, probably.
36:09In the history
36:09of many worlds
36:10there have always been
36:11disposable creatures.
36:13They do the dirty work.
36:17An army of Datas,
36:17whole generations
36:18of disposable people.
36:21She's talking about slavery.
36:27The term robot itself
36:28comes from
36:29the Czech play
36:31Rossum's
36:32Universal Robots
36:33and the word
36:33Robota
36:34means laborer.
36:36A pejorative version
36:37of it means slave.
36:38So our conception
36:39of what robots will be
36:41is directly
36:42umbilically connected
36:43to our idea
36:45of them
36:46as an underclass.
36:47Why do you think
36:48your people made me?
36:49We made you
36:49because we couldn't.
36:50You are just a machine.
36:51An imitation of life.
36:53Replicants are like
36:54any other machine.
36:55They're either a benefit
36:56or a hazard.
36:57If they're a benefit,
36:58it's not my problem.
37:00Blade Runner
37:00is a slave narrative.
37:02Basically,
37:03they've created
37:04these replicants
37:05to be our slaves.
37:07And I think
37:07that's the part
37:07that's really troubling
37:08about Blade Runner
37:09is that
37:10not only is it
37:11sort of this
37:11technologically
37:12and environmentally
37:14ruined future,
37:15it's sort of a morally
37:15and ethically ruined
37:17future as well.
37:18I wrote
37:20those first couple
37:20scripts
37:21thinking of
37:22a very small movie
37:23and then Ridley
37:24said to me,
37:26what's out the window?
37:29And I said,
37:29what's out the window?
37:30Well, the world.
37:31He said, exactly.
37:33What world is that?
37:34Where you make
37:35a robot
37:36indistinguishable
37:37from a human?
37:38Think about this
37:39for a second.
37:40Imagine.
37:41And he does that to you.
37:42You go, boom.
37:43I said, oh, God.
37:45He delivered a world.
37:46That's Ridley.
37:47He makes things.
37:49Ridley brought
37:49everything to it.
37:51Blade Runner
37:51comes from
37:53a Philip K. Dick novel,
37:55Do Androids Dream
37:56of Electric Sheep?
37:57Philip K. Dick
37:57was very prolific
37:58and very profound,
38:00talking about
38:01the nature of reality
38:02and the nature
38:03of artificial intelligence
38:05and what it is
38:06to be human.
38:07That was the nut
38:08of the idea
38:09that grew
38:09with Hampton
38:10into what it was.
38:12Here was this
38:12beautiful, beautiful film,
38:14dark, noir-ish,
38:16you know,
38:16and I thought,
38:18wow, a film can be
38:19so artistic.
38:21And the idea
38:21of these machines
38:22challenging us
38:23and their lack of affect,
38:25their lack of emotion.
38:26The film's constantly
38:27saying there is no emotion.
38:28The computer
38:29just makes decisions,
38:31right,
38:31negative or positive.
38:33They don't,
38:33it doesn't really care.
38:34Yeah, with the
38:35Voight-Kampff test.
38:36Correct.
38:38Is this to be
38:39an empathy test?
38:40Capillary dilation
38:41of the so-called
38:42blush response?
38:43We call it Voight-Kampff
38:44for short.
38:45The Voight-Kampff
38:46is a series of questions
38:48that allowed
38:48the questioner
38:50to find out
38:51whether or not
38:51who was being questioned
38:52had feelings.
38:54It's your birthday.
38:55Someone gives you
38:56a calfskin wallet.
38:57I wouldn't accept it.
38:59It's about empathy.
39:00You're reading a magazine.
39:01You come across
39:02a full-page nude photo
39:03of a girl.
39:04Is this testing
39:05whether I'm a replicant
39:06or a lesbian,
39:07Mr. Deckard?
39:07Just answer the questions, please.
39:09Not to get gooey
39:10about empathy,
39:11but it does seem
39:12that empathy is
39:13the big divide
39:15between us
39:16and everything else.
39:18Deckard is very much
39:20a man in his job.
39:22He firmly believes
39:23that as long as robots
39:24are working properly,
39:25it's not his problem,
39:26but that if a replicant
39:27misbehaves,
39:29it is indeed his problem
39:30and his duty
39:31to retire it.
39:34Move!
39:35Get out of the way!
39:36Fire!
39:38However,
39:39over the course
39:40of the film,
39:40he increasingly questions
39:42whether or not
39:43disobeying orders
39:44means that you're defective
39:45or that you are a human
39:47with rights and wills
39:48and dreams of your own.
39:49I've seen things
39:52you people wouldn't believe.
39:54There's such poetry
39:56in the scene
39:57where Roy Batty's dying.
39:59Yes.
40:00It's just a magnificent scene.
40:02He wrote that.
40:03Really?
40:04Rutger wrote that?
40:05It's one o'clock in the morning.
40:06I'm going to have the plug pulled.
40:07Yeah.
40:08Literally,
40:09on everything,
40:10at dawn.
40:10Yeah.
40:11And that's it.
40:11That's going to be the last night.
40:13And Rutger said,
40:14I have written something.
40:16And he said,
40:17All those moments
40:20will be lost
40:21in time
40:23like tears
40:26in rain.
40:30I'm nearly in tears.
40:32Yeah.
40:32He said,
40:33What do you think?
40:33I said,
40:34Let's do it.
40:35So we literally went on.
40:36We shot it
40:37within an hour.
40:38Yeah.
40:39And at the end,
40:40he looked at him
40:40and gave that most beautiful smile.
40:42Time
40:45to die.
40:48And he had a dove in his hand
40:49and he let it go.
40:50Yeah.
40:51Is it saying
40:52Roy Batty had a soul?
40:54Roy Batty was
40:55a fully sentient being.
40:57Yes.
41:00Four of your films now
41:02have had
41:02an intelligent,
41:04embodied AI,
41:06right?
41:06An artificial intelligence,
41:07synthetic person.
41:08So where do you think
41:09we come out in this?
41:11Is this our,
41:11are we handing
41:13the keys to the kingdom
41:14off to the,
41:15to the machines?
41:15I don't think we should.
41:17With a creation
41:17of something
41:18so potentially
41:19wonderful and dangerous
41:20as AI,
41:21the inventor
41:22frequently is obsessed
41:23by the success
41:25of what he's doing
41:26rather than looking
41:27at the real outcome.
41:28Here's where
41:29the problem is.
41:30There's a moment
41:30where it passes
41:31over your control.
41:33Yeah.
41:34That's where the danger lies.
41:35Then you cross over
41:36and you're in trouble.
41:38You get an AI,
41:39you have to have limitations.
41:40You've got to have
41:41your hand on the plug
41:42the entire time.
41:43All, all the time.
41:44Totally.