In this episode, I explore the intricate relationship between genetic concerns and the desire for parenthood, responding to common listener inquiries about navigating these challenges. I discuss pro-natalism while addressing the risk of transmitting genetic issues, highlighting the difference between deductive and inductive reasoning in decision-making. Through relatable examples, I analyze how we assess risk when considering starting a family.

I critique contemporary media for its portrayal of instincts, arguing that it often undermines rationality. The conversation shifts to the nature of risk assessment, emphasizing that moral philosophy provides absolutes, but personal choices regarding known genetic issues require individual judgment. I share anecdotes to illustrate how love and support play pivotal roles in these difficult decisions. By the end of the episode, I aim to empower listeners with the understanding that they must navigate these complex choices thoughtfully and with self-awareness.

GET MY NEW BOOK 'PEACEFUL PARENTING', THE INTERACTIVE PEACEFUL PARENTING AI, AND AUDIOBOOK!

https://peacefulparenting.com/

Join the PREMIUM philosophy community on the web for free!

Also get the Truth About the French Revolution, multiple interactive multi-lingual philosophy AIs trained on thousands of hours of my material, as well as targeted AIs for Real-Time Relationships, BitCoin, Peaceful Parenting, and Call-Ins. Don't miss the private livestreams, premium call-in shows, the 22-Part History of Philosophers series and much more!

See you soon!

https://freedomain.locals.com/support/promo/UPB2022
Transcript
00:00Well, good morning, everybody. Hope you're doing well. Stefan Molyneux from Freedomain.
00:04So, a couple of interesting questions have come down the pipe. One is something that I've had
00:11recur over the course of the show, and it is this. Somebody says, well, Steph, you're pretty
00:21pronatalist, but how do I deal with the possibility that I have genetic issues that could transmit
00:29themselves to a child? And I find that a very interesting question. I'm very sympathetic
00:37to the challenges of that question, of course, right? But it is a very interesting question
00:41which brings up the relationship between reason and probability. I don't think there is much of
00:49a relationship between reason and probability. So, with regards to moral questions, of course,
00:58I mean, I've got this in The Art of the Argument, my book. The first is deductive reasoning,
01:05which is absolute, right? All men are mortal. Socrates is a man, therefore Socrates is mortal.
01:11And the other is inductive reasoning, which is probabilities. If you see a woman,
01:18you know she has 20 cats, right? She lives next door. She's got 20 cats. You see 19 of her cats
01:24are white with a dark spot on their chest, all 19 of them. You can assume without 100% proof
01:33that the 20th cat is the same, right? She's got a fetish or a preference for that kind of cat,
01:40right? So, you can't know for sure, but it's likely, right? If you had to bet, right? If you
01:45had to bet. I mean, when it comes to pattern recognition, there's this horrible thing in the
01:51modern world. It's truly horrible. The modern world is all about disarming. It's all about
01:56disarming the righteous. It's all about disarming the righteous. So, you see this all the time
02:04in movies and TV shows. It's the counter signal. It is the anti-reasoning, anti-deductive reasoning.
02:14So, for instance, if there's a woman, she wants to catch a bus at midnight. Then she sees a guy,
02:24I don't know, he's got a swastika carved into his forehead. He seems kind of twitchy. He's
02:31dressed like a punk and so on. And she's got this instinct to not take the bus,
02:39to wait until this guy leaves to grab an Uber or a cab or wait or walk or something like that. So,
02:45she's got this instinct to avoid him. And in just about every modern movie and TV show,
02:55her instinct is absolutely wrong. It turns out, don't you know, she's just prejudiced.
03:02She's got these cliches and these stereotypes in her mind. And it's just wrong. He's really
03:11the nicest guy. And you see this all the time. Guys with weird tackle bait hooks on their face,
03:18and they're really nice and sweet, and they like to help her move and so on.
03:23So, this is the disarming of your instincts. All of your instincts are prejudicial. All of
03:31your instincts are prejudicial. All of your gut sense is bigotry and so on. And what they're doing
03:40is they're counter-programming you to disarm you. So, if you look at how teenagers are portrayed,
03:50this is a constant theme. How are teenagers portrayed? Well, the nerds are portrayed
03:57as super, super, super nice people. They're kind of disparaged and excluded,
04:06and the nerds are just really nice and thoughtful and caring, like the Anthony
04:12Michael Hall thing, right? And the jocks, the athletes and so on, well, they're just mean,
04:21terrible, awful, wretched bullies, right? That's just pretty much a constant of media,
04:29and has been since, I don't know, I mean, I think, Revenge of the Nerds and so on.
04:34This goes back to, I guess, Eddie Haskell, who was the rather skeevy friend of the Leave It to
04:39Beaver brother. And he was portrayed as pretty, pretty askance, right? And bullies are always
04:48just mean and terrible and so on. They're not a reaction to degeneracy or dysfunction or,
04:55you know. I mean, if you look at, in my experience, and again, it's just anecdotal,
05:02but anecdotal doesn't mean invalid, right? So, in my experience, the jocks, and I spent a fair
05:10amount of time around the jocks because, I mean, I wasn't a jock myself in particular, but I was on
05:16the swim team, the water polo team. I was on the cross-country team. I played soccer and squash
05:21and tennis and baseball. And, you know, I was never at any particular elite level. I guess
05:25I did pretty well in swimming. I was seventh fastest in Ontario back in the day. But I wasn't
05:32like, you know, the letterman jock, right? But the athlete guys were always really nice.
05:41And being an athlete is often associated with higher intelligence. It's often associated with
05:49a little bit of conformity, for sure. But it's often associated with, you know, obviously,
05:53to be an athlete, particularly in team sports, you need good social skills. And you also need
05:58to be able to manage your aggression. So, you need to be aggressive in the game and shake hands
06:05afterwards. So, you need to both have aggression and manage aggression. So, there's this
06:12countersignaling, and, you know, to be honest, if this isn't a little too brutal, do the school shooters
06:18come from the football team, right? Or are they the nerdy, weak, excluded people? So, you're
06:28fed countersignals all the time, in that the purpose of media is to instill disarming anti-instincts
06:35in you: the pretty girls are cold and mean and nasty, right? That sort of Mean Girls thing.
06:43And the plain girls are nice and thoughtful and lovely and wonderful. And this goes
06:49to Clerks, right? The Kevin Smith movie. And I have never found the pretty girls
06:57to be particularly nasty. I mean, I do see them as a little aloof for sure, but that's natural
07:04because we live in a society where the pretty girls stay pretty for 20 years or more, right?
07:11Obviously, right? They stay pretty forever and ever, amen. And in the past, like the really pretty
07:18girls were supposed to get married off in their teens, have a bunch of babies, and, you know,
07:24what's that horrible line from Raging Bull? You ain't so pretty now, right? I don't mean this in
07:29any negative way. I'm just saying that beauty was supposed to be like a Lucifer match. A Lucifer
07:34match is a giant match that you use in theater, so the people in the back can see that you're
07:37lighting a match. It was supposed to burn bright and short. That was the purpose of, and this is
07:44why the beauty is so intense, is that it's supposed to, you know, you're not supposed to get
07:49Botox and face sanding and whatever the hell they do. I saw this one the other day where they put
07:55five needles deep into your cheek. It's like it's revolting. You're not supposed to be this
08:01biochemical cyborg of plastic surgery into your 50s. So the pretty girls can't be too friendly
08:09because the guys take their friendliness as invitations to become attached, right?
08:17I remember the prettiest girl in school, a very nice woman, a girl, I guess, back in junior high
08:23school. She and I became friendly, and she went to Florida with her family, and I paid her $5
08:32to buy me some shark jaws because I was really into sharks at the time, and she did, and I did
08:37ask her out, and she was very polite about it and so on, but she didn't want to go out with me,
08:42and so not, I mean, I've never found them to be nasty. I mean, certainly not more than the
08:49average, right? So all the attractive people are mean and nasty and vicious, and all of the
08:56losers, outcasts, and excluded are warm and kind and wonderful, and it's just not true.
09:02It's not the exact opposite, but that is not true. That is not true. But people consume so
09:12much media that their empiricism is propaganda, right? What they think of as real is just other
09:20people programming them. It used to be the function of theology, now it is the function of
09:26leftist ideology. So we work with probability, right? If you see a shark in the water,
09:35then you probably, like, not a nurse shark or something, right, but something that's dangerous
09:40to humans, like a bull shark or Carcharodon carcharias, hey, I told you I was into sharks, a great white
09:45shark or something, even a blue shark could be, but something that is very aggressive and will
09:51eat a human. If you see a big great white shark in the water, I mean, unless you're literally
09:57going shark watching, right, in which case I hope you're in a cage, but you don't get in the water.
10:02Now you could say, well, but, you know, the odds that the shark is going to eat me are pretty low.
10:09I mean, he might have just eaten, but you play these odds, right? Always you play these odds,
10:16right? If you're walking in the jungle and a panther is following you, you'd probably be a
10:22little nervous, you'd be a little cautious, right, or very cautious, but you could say,
10:26well, he's just curious. I'm sure he's just eaten and he's just curious, right? It's the old thump
10:32in your house in the middle of the night, right? I remember once living in a new house that was
10:36settling and it was creaking and groaning like the hold of a pirate ship and, you know, the thump in
10:41the house, you know, odds are almost certain that it's nothing, but do you take that risk?
10:47Now, philosophy has developed deductive reasoning, but evolution has created
10:59inductive reasoning, because most evolutionary choices are about inductive reasoning. So, if
11:07you want a child, if you meet a woman who's 40 and she hasn't had a period in six months,
11:16maybe she lost a bunch of weight or something like that, then you're not going to get a kid
11:21out of her. Like, you have to go because the choices are binary, but the reasoning is inductive,
11:28right? Or the instincts are inductive, right? So, if a bear is running towards you in the woods,
11:36you're scared. However, you could say, well, the bear is just curious, really curious or whatever,
11:40right? And it could be, probably not, but it could be. But survival means that you have to play
11:47the caution side of the deck, right? The caution side of the hand you're dealt. You have to be
11:53overly cautious. People who were not cautious died at a higher rate than those who were cautious.
12:00Now, those who were too cautious, overcautious, ended up kind of paranoid and unpleasant and
12:05maybe people didn't mate with them or maybe they were so stressed that they had heart attacks. I
12:09don't know, right? So, all of that is pretty foundational. So, philosophy is not about,
12:19moral philosophy is not about inductive reasoning. Moral philosophy is not about probabilities
12:27because moral philosophy is particular to humanity, to human beings, and yet all animals
12:34deal with the question or the problem of probability. So, for instance, if you've ever,
12:40I mean, if you have kids, what do kids do? They try to feed squirrels and chipmunks, right? And
12:47you can see the chipmunk or the squirrel, if it's a wild chipmunk or squirrel, which I guess they are,
12:52Peanut excluded, RIP, if you see your kids trying to feed the squirrels, you can see the squirrels
12:59trying to calculate. They want the food but they're afraid of getting caught. It's the same
13:03thing with birds. I remember being in northern Ontario with my daughter with a plate of french
13:09fries and we were trying to feed all the seagulls, right? So, the seagulls wanted the french fry
13:19but the seagulls were afraid of being caught. So, they are weighing probabilities. I mean,
13:24animals as a whole spend a lot of time weighing probabilities, right? A lion chases a zebra and
13:30if the zebra runs really quickly or gets too much of a head start, the lion might run for a few
13:36seconds and then calculate deep in his instinctual sense, his gut, that he's going to expend more
13:43energy trying to catch the zebra relative to what he's going to get, right? Or, you know,
13:50he's going to risk tripping, the ground is too uneven, you know, if he breaks his leg,
13:54that's it for him as a hunter, he's just going to die in agony. So, he creeps up close and he
14:00weighs the probabilities, well, if I get any closer, they're going to smell me and run away,
14:05but if I'm this far away, it's going to be really hard to catch them, you know, all of this kind of
14:09stuff, right? So, lions, I mean, we can see this all over, all over the place in nature, right?
14:17And so, they're constantly working with inductive reasoning, so to speak. Now,
14:24we wouldn't formally call it that. So, if the woman of your dreams, let's say you want three
14:31kids and the woman of your dreams is 35, right? You meet her, she's 35, right? Well, if she's 20,
14:38you might still not get your three kids, right? Because a 20-year-old can be infertile and a
14:4535-year-old could conceivably, boom, boom, get you three kids, right? So, you have to play the
14:53odds, though. If you want three kids, you're better off going with the 20-year-old than the
14:5935-year-old. And if you are concerned about, again, this is not medical advice, this is just
15:07my vague memory of it, so don't take anything I say with any seriousness at all, but if you're
15:12concerned about the genetic health of the fetus, you can get a sample, but that means piercing the
15:18amniotic sac, which has risks to the baby. So, we all have to weigh these probabilities, right?
15:25I mean, I enjoyed and found it important to do politics for many years, and then the cost-benefit
15:37changed, and I no longer found it as valuable. Like, every time you drive for something that's
15:44not essential, right? Like, heaven above help us, my family, we all drove into Toronto to go
15:53to Casa Loma. Now, that was not an essential trip at all, and we risked dying in a fiery car crash.
16:00Well, maybe not fiery, because traffic was moving at a snail's pace, but that was the reality, that
16:07this was a non-essential trip, and we risked death in order to see a castle. When you fly,
16:15non-essential, right? You understand, right? So, we're all weighing these things.
16:20If you want to gain a lot of muscle, then you may exercise to the point where you get injured,
16:26right? But these are all things that animals do, and because they're things that animals do,
16:33it is not the province of moral philosophy. So, when people say to me,
16:39what risk should I take? That is not the job of a moral philosopher. I hope that this is,
16:46hopefully, not too long a way of explaining why I understand why people ask me this, for sure.
16:52I really do, but it's not an appropriate question for a moral philosopher, because a moral philosopher
17:00will tell you good and evil, right and wrong, in absolute terms, right? Rape is absolutely evil
17:08and wrong. Theft is absolutely evil and wrong. Assault is absolutely evil and wrong, and murder
17:13is absolutely evil and wrong. So, there's no ambiguity there, but in terms of what risks
17:21you should take, that is a matter of a cost-benefit analysis. Now, a cost-benefit analysis can lead
17:28you to great evil, right? So, amoral or evil or morally repulsive men might say, well, I'm not
17:38having any luck getting a woman to mate with me, right? So, this evil guy would then choose
17:45to rape. Now, does this pass along his genetics? Well, not very well, because his victim will not
17:53want to care for the offspring, but it's a higher chance than zero, and it's zero if he can't get
18:00anyone to mate with him, right? So, that's a cost-benefit analysis at a biological level that
18:04leads to the great evil of sexual assault and rape. It's the same thing with theft. Theft
18:11is generally pursued by people who are unloved, because if you're loved, you just ask people for
18:18things, right? Freedomain.com slash donate. Show me the love, show me the love, right? So, no,
18:24if you're loved, right, the people who are homeless have burned every bridge in their life,
18:29right? There's nobody who wants to take care of them anymore. There's no couch for them to crash
18:34on. There's no one who'll give them a job, like they, you know, maybe they're addicts or other
18:38people with dysfunctions who, there's nobody left to love them. And so, people steal or end up in
18:44these kinds of situations because they are unloved, and love is the great shield against these kinds
18:49of misfortunes and disasters. So, it's like the question that people could ask me and say, well,
18:56should I start my own business or should I work for someone else, right? So, I mean, if you're a
19:03male, particularly if you're a white male, you know, you may have some difficulties getting
19:07hired. So, maybe it's better for you to start your own business and so on, right? So, I can
19:14remind people of the various factors involved, but I can't tell anyone what to do. So, if people have
19:21really messed up parents that are putting them down and so on, right? Obviously, it's not immoral,
19:28it's not evil to be in contact with abusive people. It may be immoral to put your children
19:38under the care, quote, care of abusive people, like if you have abusive parents and they babysit
19:44your kids and they yell at your kids or hit your kids, that could be immoral, or certainly the
19:48hitting, yes, because you're delivering them unto evil. But if you, yourself, and I've said this
19:54before, like you don't have the right to put your children in abusive situations, but now you,
19:59yourself, it's your choice. If you want to spend time with abusive people, I don't recommend it,
20:06but it's not a moral question like good and evil. It may be a functionality question,
20:11it may be a happiness question and so on, but it's not a foundationally moral question.
20:16It certainly is a question of love if people care about you and then don't seem to care that
20:23you spend time with people who put you down or insult you. Well, that's a lack of love, right?
20:28That's a lack of love. So I can point out the costs and benefits, right? So I can say, well,
20:35if you spend time with abusive parents, that's going to really hamper the quality of the man
20:42or woman who's going to date you, right? It's going to affect your self-esteem and your confidence,
20:46like it's going to have negative effects on things, right? It's like the doctor will tell you
20:52if you keep smoking a pack a day of cigarettes, you're very likely to get sick, like 50% of smokers
21:00die from smoking, right? But he can't knock the cigarette out of your hand all the time.
21:09So when people come to me for a moral answer, I will give them the moral answer with great
21:14certainty and hopefully some vivacity and convincibility. But when people come to me
21:22with a cost-benefit analysis, I will point out the costs and benefits, but I won't tell them what to
21:26do because the costs and benefits have to be weighed within the mind of each person, right?
21:33So should you spend time with abusive parents? Well, let's say that your father is dead, your
21:39mother is on her deathbed, and you are going to inherit $10 million and you are going to devote
21:46that to the spread of peaceful parenting. Does the cost-benefit mean that you go visit your mother a
21:53couple of times on her deathbed and not confront her about the wrong that she's done, but instead
21:58take the money and do some good with it? Given that it's not immoral to go and see your mother
22:02on her deathbed, and given that you can do great good with the money, this is not a Crime and
22:06Punishment Raskolnikov situation, right, in which he kills a pawnbroker and her sister to get
22:12money to, quote, do good, right? So you can see the cost-benefit. So I can't tell people what to do
22:20with regards to cost-benefits. I can tell them what to do with regards to morality, sure, yes,
22:25but that's UPB, right? But I can't tell people what to do with regards to weighing costs and
22:30benefits. And when you have a cost-benefit question, such as, I have a genetic disorder that has an X
22:39percent chance of transmitting to my children, should I have children, that is not a moral
22:45question. Now, obviously, if it's a 99% chance that your child will die before six months of age,
22:55that would be a pretty gruesome thing to go through, and obviously the odds, I mean, that's
22:59an easy decision to make, not a pleasant decision to make, but it's like, that's no good, right?
23:05If it's a 1% chance that your child might have eczema by the age of 50, well, that's, you know,
23:11slightly different cost-benefit and, you know, very different odds and so on, right? So
23:16I can't give you that answer. I think it is important to take all of the factors into
23:22consideration, but it's like trying to design policy, government policy, based on
23:29cost-benefit analysis rather than morality, right? Well, if the government takes $5 million
23:35and creates 50 jobs, there's 50 jobs. Ah, yes, but what about all the jobs that weren't created and
23:39so on? It's like, well, how about the government just doesn't take their money in the first place,
23:42that's the moral answer, right? But yeah, cost-benefits, not the province of moral philosophy,
23:47and really the only job that philosophy can do is not tell you the answer to cost-benefit
23:51calculations, but to remind you of the various factors and stakes involved so that you can make
23:55a more informed decision. But the decision, of course, finally has to be yours. All right,
24:00I hope that helps. freedomain.com slash donate. If you would like to help out, I would really,
24:04really super-duper appreciate it. Have yourself a wonderful day. Lots of love. We'll talk to you
24:08soon. Bye.