• 4 months ago
"Tell me what I'm missing: I don't think it's the position of determinists that the person they are debating is incapable of changing their mind. All they say is that your unenlightened perspective is simply the result of each thing that came before the conversation. Maybe it's that your lack of exposure to the right arguments resulted from not caring enough about the topic which resulted from a preference for video games resulting from a difficult childhood resulting from being born to the wrong people in the wrong place and time resulting from everything that produced people, places, and times. They say it's the same for their preference for you to believe as they do. Whether or not you will change your mind after debating them has been determined as well, but I don't see how any of that is inconsistent with their desire for the change. They are happy to be the atoms that will have inevitably done that for you or inevitably not. I imagine it as something like scratching an itch. I don't particularly care what led to whatever dust particle landing on me in an irritating way; I'm going to scratch at it. (Or maybe I won't if it's been determined I'll harbor a kind of contrarian but ultimately meaningless attitude about my predestiny!) Yes, we make what have evolved to feel like choices, but that too was all just determined somewhere in the unspeakably complex movements of matter and energy over time."

GET MY NEW BOOK 'PEACEFUL PARENTING', THE INTERACTIVE PEACEFUL PARENTING AI, AND AUDIOBOOK!

https://peacefulparenting.com/

Join the PREMIUM philosophy community on the web for free!

Also get the Truth About the French Revolution, multiple interactive multi-lingual philosophy AIs trained on thousands of hours of my material, as well as targeted AIs for Real-Time Relationships, Bitcoin, Peaceful Parenting, and Call-Ins. Don't miss the private livestreams, premium call-in shows, the 22-Part History of Philosophers series and much more!

See you soon!

https://freedomain.locals.com/support/promo/UPB2022
Transcript
00:00Good morning everybody, Stefan Molyneux from Freedomain, hope you're doing well.
00:04Questions from freedomain.locals.com, always great to hear from the community.
00:09Thank you guys so much for tickling my brain with the bow of feather of your curiosity.
00:14Bow of feathers, I should say, we will create a flightless emu of thought and send it to
00:20the skies.
00:21Yes, we're starting with mixed and chaotic metaphors.
00:24That's just how my brain is working today.
00:27Sometimes I ride my brain, sometimes it rides me.
00:30First question, tell me what I'm missing.
00:32I don't think it's the position of determinists that the person they are debating isn't capable
00:37of changing their mind.
00:39All they say is that your unenlightened perspective is simply the result of each thing that came
00:43before the conversation.
00:46Maybe it's that your lack of exposure to the right arguments resulted from not caring enough
00:50about the topic which resulted from a preference for video games resulting from a difficult
00:56childhood resulting from being born to the wrong people in the wrong place, and time
01:00resulting from everything that produced people, places, and times.
01:05I don't quite get that, but they say it's the same for their preferences, their preference
01:09for you to believe as they do.
01:11Whether or not you will change your mind after debating them has been determined as well.
01:16But I don't see how any of that is inconsistent with their desire for the change.
01:27They are happy to be the atoms that will have inevitably done that for you, or inevitably
01:33not.
01:34I imagine it as something like scratching an itch.
01:36I don't particularly care what led to whatever dust particle landing on me in an irritating
01:39way.
01:40I'm going to scratch at it.
01:43Or maybe I won't, if it's been determined I'll harbor a kind of contrarian but ultimately
01:49meaningless attitude about my predestiny.
01:50Yes, we make what have evolved to feel like choices, but that too was all just determined
01:56somewhere in the unspeakably complex movements of matter and energy over time.
02:01Right.
02:02A very common response, I'm afraid, my good friend.
02:06A very common response, and you have not thought these things through.
02:15Okay, so I'm reaching back into the mists of time from arguments I've had with determinists
02:21over the past 40 years, and my first question is, okay, if you buy a GPS, are you anticipating
02:31that the GPS will change your behavior in some manner?
02:36If you use a GPS, right, it could be on your phone, it could be some mounted thing, right?
02:41So you use a GPS, why do you get a GPS?
02:45Well you get a GPS because your behavior will change.
02:50So let's say you've got to drive downtown, you fire up your phone, you enter your address
02:54and the phone says, oh, there's this construction, there's that traffic, and I'm going to route
03:00you here, and so your behavior changes.
03:03Why?
03:05Well, your behavior changes because the GPS is going to give you new information and that
03:12means that you are going to act in some different manner.
03:17There would be no point buying a GPS if you never used it, or you used it but it didn't
03:22change the way you drove at all, right?
03:25You never took any information from the GPS, whether there's construction or traffic or
03:29road closures or you name it, right?
03:31You would never ever change your behavior.
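A minimal sketch of the GPS point, with invented route names and travel times (nothing here is from the show): the chooser stays exactly the same, but new information changes which action gets picked.

```python
# A minimal sketch: same chooser, new information, different action.
# Route names and travel times are invented for illustration.

def pick_route(estimated_minutes):
    """Return the route with the lowest estimated travel time."""
    return min(estimated_minutes, key=estimated_minutes.get)

estimates = {"highway": 20, "downtown": 30, "side_streets": 35}
print(pick_route(estimates))  # "highway" -- fastest on the old information

# The GPS reports construction on the highway; the estimate changes...
estimates["highway"] = 50

# ...and so does the behavior. That change is the whole point of the device.
print(pick_route(estimates))  # "downtown"
```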
03:35Not many people would buy a phone and then never use it, just toss it in a drawer and
03:42go about their lives as if they didn't have a phone.
03:46How many people who need a house to live in, need a place to live in, buy a house and never
03:52move there and never deal with it and never touch it and wait for the bank to repossess
03:56it?
03:57That would make no sense, right?
03:59So the reason that we get things is because it changes our behavior.
04:04You buy a house, so you go and you live in the house, right?
04:07That makes sense, right?
04:08You buy a phone, you use the phone, it changes your behavior.
04:12So my question to determinists is always this, okay?
04:18You say you're a determinist, what changes?
04:21What changes?
04:23What changes?
04:25Now, if I think that I'm having a relationship, an online relationship with an attractive
04:34female, but it turns out that it is just an AI, right?
04:41I think I'm having a relationship, looking forward to getting married, having kids, right?
04:46And it turns out that the relationship, quote, relationship I'm having is not with a human
04:52female with whom I can get married and have children, but it is an AI, which I cannot
04:59marry and will never provide me children.
05:02Now once I get that knowledge, does my behavior change?
05:07Well, of course it does.
05:08If I want to get married and have kids and I find out that the woman I'm chatting with
05:12is just an AI, then I will stop chatting with her because she's a machine.
05:19Will I sue her for catfishing?
05:21Will I upbraid her and rail against her and send her mean texts about how disappointed
05:25I am and what a terrible thing she is and how she broke my heart?
05:30Like, no, because I've accepted that she is a machine.
05:34Now there may be some randomness in her answers, but she is not a human female.
05:41I will not attempt to morally correct her.
05:45I will, like none of these things, because she is a machine.
05:48So, her responses are not the result of free will.
05:53They are the result of machine algorithms.
05:57So once I realize that the person I'm chatting with has no free will, is not a human female,
06:06has no free will, my behavior towards that entity changes.
06:11I am not talking with Sally, the hot cheerleader.
06:15I'm talking with some Silicon Valley incel code of an ideal woman, right?
06:22You follow?
06:23When I get the new knowledge, right?
06:26When I get the new knowledge, my behavior changes.
06:31And let's say it isn't even to do with having a wife and kids.
06:34If I think I'm speaking to somebody of great influence and power in the world, and then
06:41it turns out that I'm just speaking to an AI, then I will stop, right?
06:46So if there's some Billy Joe Bob or whatever, who's got great power and influence in the
06:49world, and if I change his mind, he's going to change a whole bunch of other people's
06:53minds.
06:54So it's really important to me.
06:55I think I'm dealing with this guy who's got this giant audience or whatever.
06:58And then it turns out that it's just an AI, then I'm going to change my behavior.
07:02I'm going to accept that he's not who I thought he was.
07:06And he does not have free will.
07:08And he is not a he, it's just a bunch of computer code.
07:13So if you understand that, then you understand that once I accept that the entity I'm dealing
07:20with has no free will, my behavior changes, right?
07:27So just understand that.
07:29If I think I'm on a video call debating with someone, but it turns out that it's just some
07:35pre-recorded thing that has fortuitously guessed my every statement, I will hang up in disgust
07:40and I will not pretend to continue to have a debate, which is a pre-programmed script
07:45generated by a computer or even just some live AI rendering or whatever it is, right?
07:50So once I find out that the entity I'm dealing with is programmed and has no free will or
07:57choice, I change my behavior.
08:02Now can you imagine if I said, well, you know, I'm a young man and I want to get married
08:06and I want to have kids.
08:07And I've been talking to this great woman from Latvia.
08:08Here's her picture.
08:09Here's her text.
08:11And then you show me irrefutable proof that this quote young woman from Latvia is just
08:18AI generated nonsense.
08:21If I say, well, I'm still going to continue with no change in my interaction.
08:25I do want to get married and have kids with this, you know, hot young thing from Latvia.
08:29You proved to me irrefutably that she's AI.
08:33And I say, no matter, I'm still going to continue to court her.
08:38I'm still going to continue to woo her to the exclusion of all others.
08:42That would be crazy, right?
08:44That would be the mark of a significant mental dysfunction, right?
08:50You would probably be quite alarmed.
08:52I would seem desperate, pathetic, in denial of reality.
08:56I want to get married and have kids.
08:57She's an AI.
08:58Don't care.
08:59Still going to court her.
09:00Doesn't matter.
09:01Irrelevant.
09:02I don't want to bring such a thing up, right?
09:04That would be a break from reality.
09:05That would be almost schizophrenic in nature, right?
09:09Unhealthy in the extreme.
09:12So once you realize the entity you're dealing with has no free will and is just pre-programmed,
09:19you change your behavior.
09:21So that's my question to determinists.
09:23Okay.
09:24Let's say you're a determinist.
09:25You believe human beings have no free will.
09:27What changes?
09:29I know that if I think I'm debating with a person, but it turns out I'm debating with
09:32a robot or a computer or whatever, right?
09:35Then I change my behavior.
09:36I drop the debate.
09:37I move on.
09:38I don't re-engage in a debate unless I just want to sort of sharpen my skills or whatever,
09:43right?
09:44I mean, if you can imagine, like, let's take another example.
09:47I mean, I know this sounds a bit repetitive, but it's really important to get this massaged
09:51into the base of your brain, so to speak.
09:54So let's say that a young man is playing a video game.
09:59It's a shoot-em-up video game, right?
10:01And he finds out through some irrefutable proof, he finds out that what he thinks of
10:09as a video game is in fact the control of a mechanized warrior in some war that is actually
10:18slaughtering people.
10:19When he throws a grenade in the game, a grenade gets thrown in real life.
10:26When he shoots a rocket launcher in the game and blows up people, it actually shoots a
10:32rocket launcher in real life and blows up actual real people.
10:35Would he not change his behavior?
10:37Would he not say, oh my gosh, I was an unwitting killer?
10:41And would he not be shocked and appalled, uninstall the game, call the police or something,
10:46right?
10:47He would change his behavior.
10:48Now, can you imagine the mental state of a young man who's playing some shoot-em-up
10:52video game, finds out that he's killing people for real and just boots up the game and continues
10:56on as if nothing had happened?
10:58That would be the mark of unbelievable mental dysfunction.
11:03Even if the killings could be somehow justified in some theoretical just war theory, it still
11:10would be an appalling shock to realize that what you're interacting with are not unalive
11:17digital NPC representations but actual flesh and blood human beings.
11:21In other words, if it went from predetermined, a deterministic video game where the entities
11:28programmed and just look like people or whatever, if it went from determinism to free will,
11:37when it went from digital to actual, when it went from simulated to real life, there
11:42would be a huge shock and a change in behavior, right?
11:44You'd be horrified.
11:45You'd have PTSD.
11:46It would haunt you.
11:47You'd have flashbacks, not of the game, but of the real people you were killing.
11:51It would just be appalling what a wild change that would be in your life.
11:55So you see where I'm going with this, right?
11:58If you go from the belief that you are dealing with people to the belief that you're dealing
12:07with a robot, in other words, if you go from free will to determinism, what changes?
12:16What changes?
12:17Now, if nothing changes, the belief is pretension.
12:22It is an affectation.
12:23It is bullshit, absolute, complete, and total intergalactic stegosaurus-sized bullshit.
12:31It is the equivalent of someone saying, I'm debating with someone very interesting online.
12:36I'm really working to change his mind.
12:38Then you find out you're debating with an AI and nothing changes.
12:41That would be insane.
12:42So if you go from free will to determinism, then what changes?
12:49Now, if the person who's the determinist says, oh, no, no, I still get to change people's
12:53minds.
12:54I still get to debate.
12:55I still get to hold people morally responsible.
12:57I get to do this.
12:59Then nothing changes.
13:01Then it is a belief in determinism, which gives you all the rights, roles, opportunities
13:07and responsibilities of a belief in free will.
13:10In other words, you've taken the actions of someone who believes in free will, rebranded
13:15them as determinism, and changed nothing at all.
13:19Well, determinism means that people can't change their minds, but I'm still going to
13:22try and change people's minds.
13:24Determinism means that debate is futile.
13:25But I'm still going to debate, right?
13:27This is the same as the guy saying, I want to have a wife and kids, finds out that the
13:31girlfriend he's chatting with online is just AI and continues to woo and want to date her
13:35and marry her.
13:36It's insane.
13:38What changes?
13:40If nothing changes based on the belief, the belief is bullshit pretension.
13:45The belief is a lie.
13:46The belief is a troll.
13:48Now if a determinist believes that I have no free will and therefore never engages with
13:53me in any kind of debate, I mean, I think it's kind of a sad belief, but at least I
13:59can respect the integrity.
14:00Yeah, I don't bother trying to change people's minds because I've accepted that everything's
14:04pre-programmed.
14:06Everyone is an AI.
14:08There is only a simulation of free will and therefore, right, I mean, it's like sort of
14:13these modern intellectuals, I mean, we can all think of who they are, sort of modern
14:17intellectuals who say there's no such thing as free will and then cast moral condemnations
14:22all over the place, right?
14:25Which is like saying an AI is evil.
14:29Nope.
14:30AI is just pre-programmed responses.
14:34It's a word picker.
14:35It is not evil.
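A minimal sketch of what "a word picker" means here, with an invented vocabulary and weights (not a real chatbot): the machine samples a next word from weighted options, so there may be some randomness in the answers, but there is nothing to morally correct.

```python
# A minimal sketch of a "word picker" -- invented vocabulary and weights,
# not any actual chatbot's code.
import random

def pick_next_word(weighted_options):
    """Sample the next word from weighted options: selection, not intention."""
    words = list(weighted_options)
    weights = [weighted_options[word] for word in words]
    return random.choices(words, weights=weights, k=1)[0]

# Randomness in the output, but no choice in the moral sense.
options = {"yes": 0.5, "no": 0.3, "maybe": 0.2}
print(pick_next_word(options))
```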
14:37So saying an AI is evil is like a man stabs his girlfriend to death and you put the knife
14:45on trial because the knife is what did the killing and the knife is evil.
14:48Nope.
14:49The knife is inanimate and has no free will of its own and therefore, you would not put
14:52the knife on trial.
14:54You would put the man on trial.
14:56Blaming inanimate objects with no free will of their own for moral choices is insane.
15:03But people who are determinists will say, well, we have no free will, but Trump is corrupt.
15:10I mean, it really is a special kind of pathetic madness.
15:18And determinists almost always exclude themselves from their calculations.
15:23They'll say, well, you have no particular choice about what you do.
15:26And I've always said to determinists, okay, how much
15:32of your behavior is predetermined? Give me examples.
15:35Was it predetermined for you to ask me that question?
15:39Because they always say, well, other people's behavior is programmed.
15:42Other people don't really have any choice and so on, right?
15:45Okay.
15:46So you have no choice in making that statement.
15:48So you're just a robot making sounds.
15:52You're just a robot making sounds.
15:54And why would I interact with a robot making sounds?
15:57And then the moment they make some sort of judgment, such as: it is preferable to believe in determinism
16:03because determinism is true.
16:05Well, first of all, there's no such thing as true or false in a deterministic universe,
16:10because true or false is a choice and a preferred state.
16:13It is preferable you believe things that are true rather than you believe things that are
16:17false.
16:18There is no such thing as true and false in a deterministic universe.
16:22So think of a rock bouncing down a hill.
16:26Is its direction true or false?
16:29No, it just is.
16:30It just is.
16:32Is it right or wrong?
16:33Good or bad?
16:34Moral or immoral?
16:36Nope.
16:37The rock is just down there.
16:38You may not know exactly where the rock's going to land because there's a huge number
16:40of variables, but there is no preferred state in the universe.
16:47In a deterministic universe, there is no preferred state for the rock to land.
16:52So imagine there's some rover on Mars and in the distance it sees a rock bounce down
16:58a hill on some Martian Mount Olympus volcano.
17:03Well, obviously there's nothing for it to damage.
17:07There's nothing for it to harm.
17:08There's no life for it to affect.
17:10So is it good or bad, right or wrong, true or false, wherever the rock lands?
17:16Well, the question makes no sense.
17:20Preference is a characteristic of life and preference as a concept is a characteristic
17:27of the human mind.
17:29I prefer free speech.
17:33I prefer that Trump not set up a university wherein I perceive he's scamming people.
17:39That's immoral.
17:40That's bad.
17:41That's wrong.
17:42But the moment you say something is good or bad, right or wrong, preferred or unpreferred,
17:45you are talking about a state of free will.
17:47It is preferable for you to believe things that are true than to believe things that
17:51are false.
17:53But in a deterministic universe, there is no true, there is no false, there is no right,
17:58there is no wrong because true or false, right or wrong, moral, immoral, good, bad, these
18:03are all preferred states.
18:05And the moment you accept preferred states, then you accept free will.
18:11It is better for you to believe things that are true than to believe things that are false.
18:15Then you're saying that human beings have a choice between preferred and unpreferred
18:18states and should choose their preferred states, which is true.
18:22So the moment somebody tries to correct you, they've accepted free will.
18:24The moment somebody makes a moral judgment, they've accepted free will.
18:27The moment somebody debates, they're accepting free will.
18:29It's a ridiculous, embarrassing, sad, pathetic position.
18:33And where it comes from is sophistry.
18:38Where it comes from is sophistry.
18:40In my view, and I can back this up with a lot of experience in debates over the years,
18:46people who are determinists have done some seriously bad things in their life, and their
18:50conscience is plaguing them, and they like to say, well, I never had a choice.
18:54And so what happens is, because their conscience is plaguing them, they say, well, I didn't
18:59have a choice in what I did in life.
19:02I never had a choice about what I did in life, and therefore I'm not to blame.
19:05And this is why determinists are so hell-bent on debating this forever and ever.
19:10They're debating the pangs of their own bad conscience.
19:14And we know this because of the hypocrisy, and when people are foundationally anti-rational,
19:19it's almost always because of a bad conscience.
19:24It's almost always because of a bad conscience, because the arguments I'm making are so simple
19:29and blindingly obvious that all the sophistry in the world, like why would someone care?
19:34All the sophistry in the world is simply to keep a bad conscience at bay.
19:38A bad conscience says, you did wrong, you could have and should have done better.
19:44You did wrong, you could have and should have done better, which is true for all of us about
19:49a lot of things in life.
19:50It's fine.
19:52No big problem.
19:54But people who are determinists are fighting against a theoretical preferred state while
20:02demanding everybody follow a theoretical preferred state, such as accepting determinism in lieu
20:06of free will.
20:07They're trying to change people's minds about whether they can change their minds, which
20:11is such a performative contradiction.
20:13It's such a self-detonating argument that only a bad conscience could make it seem believable.
20:18Hedonism also gives people, sorry, determinism also gives people a kind of hedonism.
20:24Because if they make a bad choice, they can say, well, it was predetermined.
20:27They can't say, well, I should have known better.
20:30I should have thought this through ahead of time.
20:32What I'm doing was wrong and immoral.
20:33I should have done better.
20:34They can't do that.
20:36Because that would be to accept the dictates of their conscience.
20:40The conscience is the part of you that says, you should have done better and you didn't.
20:45So determinism is so anti-rational, it's saying that there is no such thing as a preferred
20:54state in the universe and you have no choice to choose that preferred state.
20:59But it's desperately important to me that you choose this preferred state.
21:03You cannot choose a preferred state both because you have no choice and as a result of having
21:07no choice, there is no preferred state.
21:10So I want to convince you that you have no choice and cannot choose a preferred state.
21:16And to do that, I will say that you should choose a preferred state called determinism
21:21based upon your free will to choose these things.
21:23It's so anti-rational that the only explanation of such craziness is psychological, which
21:29is to say fundamentally moral.
21:31They've done bad things.
21:33They don't want to accept responsibility for their bad things.
21:36They don't want to listen to their conscience.
21:38So they try to drown out their conscience with mechanistic, atomistic explanations of
21:44life.
21:47It is a sad spectacle, honestly.
21:50Let me just be brutally frank with you, and I've been debating determinists for over 40 years.
21:54It's a sad spectacle.
21:56It often comes out of child abuse, wherein the child was rigidly controlled and not given
22:01free will.
22:02I remember a debate many years ago where I asked a determinist about his childhood.
22:06It turned out he was locked in a tiny room for most of his childhood and had no choice
22:10to do anything.
22:12And it also has to do with a false forgiveness for the people who did bad things to you.
22:18The ones who abused you and neglected you.
22:19You say, well, they did the best they could with the knowledge they had.
22:22It's all deterministic.
22:24It's all atoms, and the moral dimension gets stripped and removed.
22:28So you give up free will and your capacity for virtue in order to protect abusive parents
22:33and so on.
22:34It's absolutely terrible.
22:36So this person says, all the determinists say is that your unenlightened
22:41perspective is simply the result of each thing that came before the conversation.
22:45But that doesn't add anything to human knowledge.
22:49In fact, it's a complete red herring.
22:52So if determinists are saying you are not free to go to Italy if you have no knowledge
23:01of Italy, well, I would accept that.
23:04I would accept that.
23:06If there was some magic portal that allowed you to step onto the surface of Mars and breathe
23:11freely, some magic helmet, some magic portal.
23:14When I found out about that, then I would have the choice to go to Mars.
23:17Right now, I don't have the choice to go to Mars because there's no way to get there.
23:21So you did not have the choice to phone people from the woods before satellite phones, cell
23:28phones and so on.
23:29You didn't have that choice.
23:30I remember when somebody first called me from a car.
23:35They were very excited.
23:36It was a friend of mine, called me when I was a teenager, called me from a car.
23:39And of course, all phones had cords.
23:44So of course, I had this image of the car going along the road with a long cable snaking
23:49behind it, right?
23:51So you did not have the choice to call people from the car before cell phones were invented.
23:57Okay.
23:58So what?
24:00Additional information gives additional choices.
24:03So how does that destroy free will?
24:07The saying, we are not omniscient and not omnipotent, therefore we don't have free will,
24:14is a bizarre statement.
24:16It's saying everybody is infinitely ignorant relative to omniscience.
24:21Everybody is infinitely powerless relative to omnipotence.
24:26But that's a false standard.
24:28That's a false standard.
24:30It's like saying as a basketball team owner, I have no preference for tall athletes because
24:34all athletes are infinitesimally tiny relative to the width and breadth of the galaxy.
24:42That's not the standard you use.
24:44You use the standard relative to the basket, right?
24:48The hoop, not relative to the galaxy.
24:52A fruit fly lives for only a few days.
24:54Humans live for an average of 80 to 85 years, depending on the location, at least in the
24:59West.
25:00And so we live, you know, roughly ten thousand times the length of a fruit fly.
25:08But we don't say to ourselves, I can't believe I've lived for the length of thousands
25:13of fruit fly lives.
25:14I'm functionally immortal.
25:15I must be a fruit fly vampire.
25:16That's not what you measure things relative to.
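A quick back-of-the-envelope check of that ratio, assuming a fruit fly lives about three days and a human about 80 years:

```python
# Rough arithmetic only; both lifespans are the round figures used above.
fly_days = 3            # a fruit fly lives "only a few days"
human_days = 80 * 365   # about 80 years
print(human_days / fly_days)  # ~9733 -- roughly ten thousand fruit fly lifetimes
```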
25:19It is certainly true that providing better arguments and better information does in fact
25:26give people more choices, which is why in order to give people more moral choices, we
25:32should give better arguments and more information and better perspectives.
25:38A lot of people feel, well, I can't get rid of Bob, say, even though Bob is corrupt and
25:43immoral and destructive.
25:44I can't get rid of Bob because of some XYZ relationship and you say, well, but the principle
25:48is irrespective of the XYZ relationship and you still have the choice to not have Bob
25:52in your life if he is destructive and immoral and dangerous and abusive and so on.
25:57So that gives people a choice that maybe they didn't think of before and then they have
26:02a choice.
26:03That's good.
26:04So yes, in order to give people more free will, we should give people better arguments.
26:11So if somebody says, well, you know, I have to, I just have to live in the city for X,
26:16Y, and Z reason.
26:17Well, if you say, well, okay, but you could, you know, find some remote work.
26:21You could live in the country.
26:22You could raise some food yourself.
26:23You could get some livestock.
26:25You could, you know, and then these are options and choices.
26:28And then if they choose to live in the city, that's a choice because now they have an option
26:31that they can at least consider.
26:33So yes, we do want to provide people more, better arguments, information, facts, data,
26:37reason, and evidence in order to expand their scope of choice.
26:41Sure.
26:42I don't have the choice to fly, but if somebody gave me magic anti-gravity boots, I could
26:46do that.
26:47So yes, technology, I did not have the choice to have this conversation with you 30 years
26:51ago.
26:52There was no option for me to have this conversation with you 30 years ago.
26:57Couldn't happen.
26:58I could do it now, or if we say 40 years ago, 50 years ago, whenever, right, it was not
27:03possible.
27:04I have occasionally thought over the years that I could have just recorded like way back
27:09in the day in the seventies or eighties, if I was around at this age, I could record all
27:14of these shows on cassette tapes.
27:16And then people could pay me $10,000 to ship them a giant crate of cassette tapes or something
27:21like that.
27:22Right.
27:23Theoretically, that could be possible, but it wouldn't happen in this way and it wouldn't
27:25happen with this immediacy of feedback and you certainly wouldn't be able to find shows
27:29very easily.
27:31So yeah, that's how it would be.
27:35So now we have the choice to have this conversation, which in the past we didn't.
27:39I got these questions digitally.
27:40I'm recording this digitally.
27:41It would be broadcast digitally and you can consume it digitally.
27:44Great.
27:45Very convenient, very effective.
27:47So by giving people more information, we expand their choices.
27:51Absolutely.
27:52What's the old thing Henry Ford said about the Model T car?
27:55You can get the Model T in any color you want, as long as it's black, right?
27:59So if you believe that your village elders are the ones who have to pray to the local
28:05gods to figure out who you have to marry, then you go to your elders, they pray to the
28:11village gods, they get some thoughts about who you should marry and then that's who you
28:14should marry.
28:15Then you, in a sense, are determined or predetermined to marry that person because whatever the
28:21village elders say, well, that's the person you're going to marry.
28:23I get that's the kind of determinism.
28:25And if someone comes along and says, well, that's not a rational or valid way to choose
28:28who you're going to marry, then you have the choice on who to marry.
28:32If you believe that, like if you accept that the village elders are not praying to some
28:35gods who know perfectly who you should marry, then you can make your own choice about who
28:38to marry.
28:39So yeah, I get that.
28:40Philosophy brings better arguments and information to people.
28:44What's the insight in that?
28:46If we accept that better arguments bring more choice to people, then we should be making
28:50better arguments that bring more choice to people.
28:52We shouldn't be making terrible arguments that remove choice from people.
28:56Because here's the thing, man, if determinists are wrong, and this is the humility that I
29:03never find in determinists, they just blindly spew all this nonsense with no sense of their
29:08possible responsibility for the expansion of evil and corruption in the world.
29:14They never talk about the potential downsides of being wrong, of being wrong.
29:19If free will and moral philosophy is true, that we have a choice and should choose better,
29:25more moral, then as a determinist, if you convince people of the validity of determinism
29:32and you're wrong, then you are stripping people of their moral ideals and their perceived
29:38ability to choose a better state of ethics or virtue, and thus of love, of self-respect,
29:44of having a good conscience, and so on.
29:46So if a determinist says, you know, I'm kind of drawn to this argument for determinism,
29:51but I'm pretty nervous about being wrong, well,
29:59are you nervous about being wrong?
30:03If determinism is valid, people don't get love, they don't get self-respect, the conscience
30:08is a delusion, virtue is a fantasy, and ethics are a superstition, okay?
30:15Well people have to have some way of making decisions and prioritizing.
30:17Oh, you'd say, well, they don't actually make any decisions, but people still have that perception.
30:22So by stripping human beings of the capacity for virtue, are you not opening up the gateway
30:26to hell itself?
30:29Because you're taking out the most conscientious people who want to be right and want to be
30:32accurate and willing to pursue arguments to some logical conclusion, or hopefully logical
30:37conclusion.
30:38The evil people, they don't give a crap about determinism versus free will, all they care
30:42about is the will to power.
30:45So you're disabling moral people of their only defense against evil, you're liberating
30:49evil from any restraint based upon conscience, and you're delivering the world into a state
30:54of hell.
30:55If you're wrong, which means you should be incredibly responsible about what you say.
31:02But never once in my life, which is why I say it's conscience-based avoidance or the
31:08avoidance of a bad conscience, so to speak, have I met a determinist who says, oh,
31:14you know, it's a pretty big issue that I'm dealing with, and if I'm wrong, then I'm stripping
31:24good people of the capacity for virtue and empowering evil people to indulge their every
31:29whim.
31:30I am really worried and concerned about the possibility of being wrong and what it might
31:37do if I talk conscientious people out of virtue and say to bad people there's no such thing
31:43as badness, disarming the good and unleashing the will to power on earth, which turns it
31:49into hell itself.
31:50Nope, never heard that.
31:52Never heard any concern whatsoever.
31:54It's all this blithe nonsense about atoms and obvious stuff and a complete avoidance
32:00of any self-contradiction.
32:03It's a wretched position.
32:04I would view it as a kind of brain fungus or environmental toxin, and I would stay as
32:09far away from this perspective and, in particular, people who push it as humanly possible.
32:15Now, I'm not including this particular person who wrote the question because they may not
32:19have heard these arguments and I'm trying to shake them out of their dogmatic enablement
32:24of evil slumber, but if somebody, after being exposed to these arguments, continues to push
32:29determinism, they are literally, literally trying to steal what makes you human and turn
32:37you into a machine, and that machine will inevitably serve only the most corrupt.
32:42Freedomain.com slash donate.
32:44Thank you so much for your time and attention.
32:46Lots of love from up here.
32:47I'll talk to you soon.
32:48Should I so choose, and choose I will.
32:50Bye.