This discussion explores the ethical justifications for slaughtering animals, specifically questioning why it is permissible to kill animals for preference while it is not permissible to kill humans, even those with diminished capacities. The conversation delves into various ethical frameworks and criteria, ultimately struggling to establish a consistent and universally accepted moral distinction.
So, just to reiterate, because I haven't recorded earlier: what do you think is a good justification for slaughtering an animal for, like, just your preference?
>> Um, ethical and cultural reasons.
>> Okay.
>> Simple.
>> Uh, what are the ethical reasons for slaughtering an animal for your, like, taste pleasure?
>> Um, overpopulation, for instance. I used to hunt boar. Hunting boar is pretty nice. It's not factory farmed. Sure, it bleeds for a second, but after that it's pretty clean.
>> Okay. So, if I kill you pretty clean, is that okay ethically?
>> Well, the question is, am I a boar?
>> Uh, no. You're you.
>> Yeah.
>> Yeah. So, it's okay for me to kill you if you bleed just a little, quickly? Sorry. Hang on, hang on, one second. I did have a follow-up question, though.
>> Yeah. The question was,
>> "Is there a separation between me as a moral agent in your society?"
>> Mhm.
>> Or the boar,
>> or is a boar a philosophical p-zombie, if you will?
>> Yeah. A zombie.
>> Those are different creatures, but I'm asking: what is the difference? Hold on. What is the difference such that it makes it okay to kill the boar and not you? If it's really quick and humane, really clean, you bleed for a little bit, but then it's done.
>> Well, the symmetry breaker here is, um, well, animals, for all intents and purposes, are what I consider not moral agents, right?
>> Why does that matter? Babies are not moral agents. Why does that matter?
>> Um, babies are considered moral agents, technically, because they're under society's protection.
>> No.
>> Oh, well, they're society-protected, that's for sure. But in what way are they considered moral agents? Do babies know right from wrong?
>> Not necessarily.
>> Yeah. If you give a baby a gun, is it going to know not to point it at its parents?
>> No.
>> Yeah. There you go. So babies are not
moral agents. So it's okay to slaughter
babies, right?
>> I mean, your reductio doesn't land, for a reason.
>> Well, as well,
>> I can tell you that.
>> Yeah, sure.
>> I mean, simply put, we have constructed social norms.
>> Mhm.
>> So, if those social norms are constructed, right?
>> Yeah.
>> Uh, we get to pick and choose what are moral agents and what falls under a protected class.
>> Well, that doesn't matter. Babies are not moral agents. They don't know right from wrong, but eventually they're going to be moral agents.
>> Well, sure, eventually. But the boar will never be a moral agent.
>> Yeah, they're not going to become moral agents if you kill them really early, before they get to be moral agents.
>> So, they are protected.
>> Abortion is a thing, isn't it?
>> Sorry?
>> Well, abortion is still a thing, isn't it?
>> Yeah. Well, usually babies are not aborted, right? Like, after they're born, when we can actually call them a baby.
>> I do have a question.
>> No, no, no. Let's stay on this. We're not going to go into a different place, right? We're going to stay on this. Hold on. You made the claim that because animals are not moral agents, it's okay to slaughter them. Babies are not moral agents. Why is it not okay to slaughter babies?
>> I mean, I did basically just describe babies as moral agents, technically.
>> Well, you described them as such. That doesn't make them moral agents. If you give a baby a gun, it's not going to know not to shoot its parents.
>> Yeah, it's going to be exempt.
>> Uh, yeah. Well, moral agency, then, is not the trait that makes it okay to slaughter somebody.
>> So, what's the morally relevant difference between a boar and a baby that makes it okay to slaughter a boar but not a baby?
>> Under societal norms; it's a constructed thing.
>> Yeah, hold on. Hold on. I understand societal norms. Can societal norms be wrong?
>> Can I answer?
>> Now that I've asked an actual question? Sure. Can societal norms be wrong?
>> Um, yes. Because society is changing.
>> Awesome. So it's not a good measurement of morality, is it? If they can be wrong or they can be right.
>> Well, it's the only measurement of morality.
>> Sorry?
>> I said, no, it's the only measurement of morality, technically.
>> Well, if they can be wrong, then how can it be a measurement of morality, if literally societal norms can be
>> it's not necessarily wrong or right in
the traditional sense of getting, you
know, a binary question right or wrong
about like, you know, external affairs.
>> Oh, awesome. If we find a society where it's not a societal norm to not slaughter babies, then it's morally
>> Um there's no universal law against it.
However, I would
>> We're not talking about laws. We're talking about morality. So if we find a society, right, like the Sentinelese or something like that, it's a society. Hold on.
>> Hold on. It's a society where maybe it's, like, totally not society-protected.
>> Clarifying question.
>> Yeah. Go on.
>> I'm asking a clarifying question for a reason. Are you presupposing a moral realism of some sort?
>> No, I'm a moral anti-realist. I'm just asking you about your morality. Would you be okay, on your moral stance, if a society that doesn't have the moral norm against slaughtering babies slaughters babies?
>> Um
>> Jesus, it takes you a long time to answer this.
>> If it's my kid in the firing line, no, I'm not going to be fine with it.
>> Shifting the goalpost. So, yeah, you're shifting your goalpost, and also, I guess, societal norms are not a measurement of morality.
>> I'm thinking of it in the frame of intuition.
>> Yeah. So societal norms are not a measurement of morality, are they?
>> Well you're putting me in a society that
is contrary to my current one.
>> There you go. So societal norms are not a good indicator of what is moral and what is not. Because if I pluck you out of your society and plunk you into a different society, you and your child, then your child is [ __ ]
>> You find that immoral.
>> Well, we're basically going down to intuitions now, right? So, what about your intuitions is superior to mine?
>> I didn't make the claim that my intuitions are superior to yours. I'm just asking you what morally justifies, for you, slaughtering a boar but not a baby. Societal norms are not that; you made that claim, but apparently, even if societal norms don't exist, you still... hold on, hold on. I'm asking a question. Can you let me finish?
>> I just want to make this very clear. So far we've done this: I asked you what the morally relevant difference is; you said moral agency, and when I said babies don't have moral agency, you said it's still not okay to slaughter babies. Then you said societal norms are the difference, but when I said I'm going to pluck you into a society that doesn't have those norms, you still had a moral issue with it. So what is the morally relevant difference between you and a boar?
>> Symmetry breaker.
>> Yes. Because the symmetry breaker is very much the same as, you know, why would I kill another person versus an animal, right? A person is
>> Yeah. Different ontologically from maybe a deer, a bear, you know, etc. Right.
>> What do you mean by "different ontologically"?
>> What's different?
>> Yeah. What do you mean by "different ontologically"?
>> Um, the difference is going to be in many things: nervous system, cognition.
>> Sure.
>> In many cases, I would argue that it comes down to intuition, because of these various factors, right? So, let's say a dog is attacking a human. Even if it's my dog, I'm probably going to have to shoot my dog
>> um, that's about to rip out some kid's throat, right?
>> Yeah. Sure. That's just about
>> So, I understood your point. Let me ask you a question. Hold on. You said ontology, so nervous system, right? So what's the evidence that a pig's nervous system is different from a human's nervous system?
>> Um, the very fact that we have higher cognition. I mean, the CNS is different.
>> No, the CNS is the same. We have higher cognition for different reasons. The CNS is exactly the same in a pig and a human being. Both are sentient.
>> Um let me look that up.
>> Yeah, sure.
>> Are you just running "name the trait"?
>> Uh, yeah, I'm trying. So the guy's slippery.
>> Um, the human central nervous system (CNS) is more developed, particularly in the cerebral cortex.
>> Okay, cerebral cortex. I guess the brain is part of the CNS. So why does that make it okay to kill a boar? Because its cerebral cortex is less developed?
>> Um, again, I listed several reasons for the boar.
>> None of them, sorry, none of them cashed out into it being okay to slaughter that boar. So I don't understand: if a human being had an underdeveloped cerebral cortex, right, so in the marginal cases, somebody who was born mentally challenged, would that make it okay to slaughter that human being?
>> Um, would it be okay to slaughter that...? No.
>> So the cerebral cortex is not the trait that makes it okay to slaughter a boar but not a human being. What is the difference-trait?
>> Um, again, the difference-trait is not just one particular thing.
>> Okay. Well, we're killing them one after the other. So we killed societal norms, we killed moral agency, and now we killed cerebral cortex. None of those reasons makes it okay to slaughter a pig. Give me another reason.
>> This is where we use particularist ethics, though.
>> No, no, no. Let's not go into meta. Why is it morally okay to slaughter a boar but not a baby?
>> Hang on. But your meta justifications
are important.
>> No, I don't give a [ __ ] about meta. I'm asking you about your morality. The two of us can have different meta-ethics. But when I'm asking you about your morality, just give me a justification from your meta-ethics, from your point of view.
>> So we don't have to go into meta for you to tell me what your applied ethic is. So please provide your applied ethic.
>> Look, my friend, simply put, the reason I'm telling you about particularism, right, this is why I'm a particularist, right?
>> I don't give a [ __ ] No, no, no, no.
Please answer my question. What is the
moral? No, answer my question.
>> A pig and a kid with Down syndrome. I'm not going to shoot the kid with Down syndrome just because they both have an underdeveloped cortex.
>> So that's not the difference. What is the difference? That's not the difference; we already reached this conclusion. What is the trait difference?
>> Participation in society. The pig, right, in this case, is an invasive species in North America. Right?
>> Okay. Sure.
>> So, I mean, I have many reasons to shoot the pig. Not the kid with Down syndrome.
>> Okay.
>> Okay. Awesome. So, if a bunch of kids with Down syndrome were invading the fields, and you could stop them in a different way, right, would you shoot the kids with Down syndrome because they're invading your fields? Yes or no?
>> Um, probably not, right? Because
>> Oh, yeah. So that's not the trait difference, is it?
>> The kids, we can put the kids with Down syndrome back in school. Um, however, stopping an invasive species is a lot harder than stopping a bunch of
>> kids. You heard of fences? Ever heard of fences or tranquilizers?
>> Have I ever heard of tranquilizers? So, let's say we could
>> Boars in particular burrow under, um, you know, fence posts, etc. I mean, they're pretty crafty creatures.
>> Yeah, it just seems like... If, with the kids with Down syndrome, you couldn't actually create barriers and you couldn't put them in schools, would you just slaughter them?
>> Well, no. Like, I did want to ask, like, a counter.
>> No, no, no. Please answer my question, because we're going one by one.
>> This question goes two ways.
>> No, no, no. Here we are asking why you are not vegan. So we're examining your morality. So far, all the reasons, let me finish. So far, all the reasons that you gave for slaughtering pigs but not slaughtering humans didn't pan out, because when I equalized them to humans, you said, "No, I wouldn't kill that human."
>> In philosophy, we don't have moral discussions as just a one-way street, right? Questions are there to clarify.
>> Well, you can ask a clarifying question, but it seems that you just want to ask me meta questions, and I'm not going to abide that.
>> Well, no, I want to ask you a particular question, right?
>> Okay.
>> So, like earlier, we were going down the path, right? Um, so essentially, you don't want to kill things based off intelligence. But if that's the case, why would you be in favor of abortion, in a sense?
>> Uh, so first of all, you're assuming that I am in favor of abortion, right? Shouldn't the first question be: are you in favor of abortion?
>> But are you, though?
>> Okay. So, I see no moral problem with aborting a fetus that isn't sentient. I do see a moral problem with aborting a fetus that is already sentient. But when it comes to choosing, and it's a hard choice, between a fully sentient woman who has to carry that fetus and for whatever reason cannot or refuses to, then I would choose the fully sentient woman and not the partially sentient fetus, because fetuses gain the structures of sentience at 19 to 23 weeks. Most of the time, though... yeah, let me finish my answer, Jesus, right?
>> Right.
>> So most of the time, those fetuses aren't actually in a sentient state. But even if they were, I would consider it a tragedy, but I would choose the fully fledged, already sentient person who had a lifetime of experiences, like not dying, not experiencing actual torture, over that fetus. That's my position.
>> Okay. So if that's the case, right?
>> Mhm.
>> And I'm looked down upon in quite a few instances for evaluating intelligence, or...
>> No, I didn't evaluate any intelligence. Don't put words in my mouth. I didn't say intelligence is the issue.
>> Well, no, you clarified what's the symmetry breaker between a kid with Down syndrome and a pig, right?
>> Yeah.
>> Um, basically, to break down, you know, where I would choose a person with lesser intelligence versus the pig, right?
>> That's not the point in question, though. Wait, wait. That's not the symmetry there, right? I'm not asking you a case where the only thing you can choose is either shooting a pig or a kid with Down syndrome. I would understand why you would say, "Well, I'll shoot the pig," but that's not the situation we're putting you in. Hold on. We're asking you why you're shooting a pig. The kid with Down syndrome is only there to analogize, standing in for the pig.
>> Right? That context is important.
>> You're not killing the [ __ ] pigs.
>> I know I'm not, but hold on, right? You were asking me for a symmetry breaker between the kid with Down syndrome and a pig, right? And I'm basically just trying to drill down: again, if intelligence isn't supposed to play a factor, why can't I treat Kale or Adore as a moral agent?
>> Sorry?
>> Right.
>> Excuse me. Or why value something without sentience? Why would you value something without sentience?
>> I don't value something without...
>> I know you don't, right? So why would you value something without sentience? To me, that doesn't become a symmetry breaker, because many people value things without sentience.
>> Well,
>> it becomes a double homicide.
>> Yeah. Usually there are some cringe people who would say, "Oh my god, plants... if I had to choose between killing a rose or killing a dog, I literally don't know what I'm going to do." But most people know that plants are not sentient, and they don't value plants for sentience, or for their own sake, right? They might value plants for what they can do for sentient beings. For instance, I will be upset if you break my phone. Not because I care about the phone itself; it's not sentient. It's because I spent money on it or whatever. So I, as a sentient being, am going to be upset if you break my [ __ ] phone. Right? So I'm not valuing the phone. Nobody values the actual non-sentient thing for that thing itself.
>> Correct.
>> Correct. So sentience is the actual symmetry breaker between what we should value for itself and what we shouldn't.
>> I mean, I kind of question that, though, right? Because we can place, let's say, a superficial value on something like a phone or even plants.
>> Yeah. Not for itself; for me, for the sentient beings in that thing's life, or not life.
>> I mean, but this boils down to something simple to me, then, right? If, in society or in many cases, people value non-sentient things, I mean, sentience isn't the only marker of moral worth,
>> It is.
>> um, in my opinion.
>> I mean... hold on. If there is a phone that nobody owns, literally nobody, it's just lying abandoned in the middle of a desert, and you sit there and wait for two weeks or a month for somebody to come back for it, right? Nobody comes back. Is it wrong to destroy that phone?
>> Is it wrong to destroy that phone? Um, I mean, no one's coming for it, so I guess, why not?
>> Exactly. So you don't value the phone itself. You don't value the non-sentient thing. You value it only if some sentient being wants it. That's it.
>> Yeah. Correct.
>> There you go. So, exactly. So sentience is a moral symmetry breaker.
>> No. Well, not necessarily.
>> Well, give me an example where it isn't.
>> Sentience produces value by having a subjective experience.
>> Exactly.
>> However, we're talking about um
>> what gives things value, right?
>> Uh so no what makes society chang no
>> to sacrifice children.
>> No, I mean, that's not the question. You're strawmanning. Maybe not strawmanning, but that's not the question. The question isn't what we value; the question is what makes a thing, you know, a philosophical thing, morally relevant or not. Sentience is the symmetry breaker.
>> I have named it. No.
>> Yeah. I think it's sentience. What's the
problem with me making it sentience?
>> It's subjective experience. I mean, yeah,
>> placing value on things, or placing value on something, it's not necessarily sentience itself.
>> It is. The subjective experience is the sentience. Only sentient things can have subjective experience. Something that can have...
>> Go on.
>> If I ran a trolley problem on many people and put their family on the tracks,
>> and then put 10 other people on the other side of the tracks,
>> nine times out of 10, I reckon, the person is going to pull the lever and save their family.
>> Yeah, sure. Why does that... hold on. That's still sentience that they value there. They just value some sentient beings more than others. If you make it a trolley problem where I put a bunch of phones on one side and, I don't know, just some dog on the other track, most people are going to run over the phones.
>> Yeah.
>> Yeah. So the symmetry breaker is sentience.
>> Yeah. But, however, you realize, though, in my example, one of the key markers here is sentience,
>> where, you know, particular people are valued more than others.
>> Sure. As in the example I gave you, right? Wait, hold on. I gave you the example, and I bit the bullet with you, right? If it was, "Oh my god, you have to shoot one, or the world ends. You have to shoot a wild boar or, I don't know, some kind of child." It doesn't matter if it's a mentally challenged or not mentally challenged child. Let's make them actually, um, relatively the same, right? So you have... hold on, hold on.
>> So in the question, right, in the question, we put two sentient beings in the same place. You can value one over the other, but sentience is still a symmetry breaker, because when you put in a non-sentient thing, that thing doesn't even enter the moral consideration.
If it's "cut a rose in half or cut a woman in half," you would say, "Yeah, obviously the moral thing is to cut the rose in half," because it's not sentient. Even if you don't know the woman, you don't give a [ __ ] about her, she's from a country that is at war with your country or something like that, right?
>> So, if I stab a pregnant woman, right, technically, under this moral system, it shouldn't count as homicide.
>> Uh,
>> it should just count as, um, you know, battery, or very violent assault, or attempted murder.
>> Well, it depends on whether the fetus is sentient.
>> It's not really sentient until, what, the second trimester, maybe the third?
>> It's 19 to 23 weeks. Yeah. So before that... well, that's where it becomes a little murky, because I guess the woman would value that fetus like the phone, for instance, right? The phone and the non-sentient fetus probably have generally the same moral value at that point, but to the woman it has a higher value or something.
>> But I don't think it would be homicide, no, because the fetus may not be sentient yet, but it's valued just the same as a human.
>> Yeah, sure. A human being can value something, right? But, um,
>> and in that case, that doesn't really make my symmetry breaker earlier, that the boar is different from the child, um, the kid with Down syndrome, any different from, let's say, the fetus.
>> Well, you can say, "I value the mentally challenged child over the boar." That wouldn't be a good justification to kill the boar, because the boar is sentient.
>> I mean, the only difference here is the boar is not an actor in society.
>> Why does that matter? If the Sentinelese... hold on, let me ask you a follow-up question. The Sentinelese are not actors in society. In fact, if you try to come closer to them, they're going to kill you. Is it okay to grab a bunch of them, right, if you successfully did it without being killed, and just start farming them or something like that, or just straight up shooting them from afar? Is that okay?
>> I mean, in what way would that be relevant to the pig?
>> Well, because the Sentinelese don't participate in society,
>> and that's the trait that you gave me.
>> The island is protected by the Indian government.
>> Listen, I'm just not understanding why not participating in society, why that trait makes it okay to slaughter somebody, right? That's all I'm trying to understand. And when I put a human being in that position, you're flailing now, right?
>> Not really.
>> Well, are you going to say that it's okay to kill the Sentinelese?
>> You're basically just asking me when it is okay to kill someone versus when it isn't?
>> Yeah.
>> And I can give you that distinction. I mean, it's okay to kill a person in a time of war.
>> Yeah.
>> That's, like, the only time I can justify killing someone.
>> Yeah. In that context, it's justified. I'm just asking why it is justified to kill a boar. Because it doesn't participate in society? Why non-participation in society? Why? Let me finish asking the question. Why non-...
>> Yeah, because you're naming just one trait. I mean, if I
>> I'm not naming any [ __ ] traits. I'm asking you what trait makes it morally okay to kill a boar.
>> Hang on. I mean, that's just one thing about the Sentinelese. Like, if I went up to Sentinel Island right now, right,
>> took out my 45 Magnum,
>> you know, aimed it at someone, and then, you know, popped the person on Sentinel Island,
>> the way you do. Kind of like an act of war on their tribe, you know, or
>> So that's the only reason it would be not okay to just randomly shoot a person?
>> Yeah. So that's the only reason it would not be moral to kill the Sentinel Island people: because they would go to war with you?
>> Why don't you let them finish? I don't
think you finished.
>> Well, no, they made the point.
>> It's a mix of a bunch of things again. Empathy, for one, right?
>> Yeah. So why don't you have empathy for boar?
>> Why don't I have empathy for boar?
>> Yeah.
>> Because boar and person are different.
>> Okay,
>> they're different distinctions.
>> So what's the morally relevant difference that makes it okay to kill boars but not okay to kill humans?
>> I mean, I just named it: one is human.
>> And then it fell apart when I asked you about the Sentinelese.
>> Did it, though?
>> Yeah, it did, because you said you wouldn't kill the Sentinelese.
>> I mean, you're just saying it did, but I don't think it did.
>> Well, would you kill the Sentinelese because they don't participate in society?
>> I mean, that's just my preference. I would not kill a Sentinelese person.
>> There you go. So it's not a morally justified reason to kill somebody.
>> Um, that is kind of a morally justified reason. It's falling down to preference at that point.
>> Ah, yeah.
So, you see this? Here's the problem, right? You can't even actually make...
>> All choices are preferences, just like I showed in the trolley problem.
>> Yeah, yeah, that's cool. But again, the trolley problem is not applicable here, because the trolley problem that we're presenting to you is not "put a pig on one side and put a Sentinelese person on the other side." It's... put a pig... hold on.
>> It's "put a pig on one side and put nothing on the other side, and you're still running over the pig."
You're getting agitated for no reason.
>> Sorry, I'm not agitated. I'm fine. It's just how I speak.
>> Okay.
>> So, I mean, what I'm trying to get at here, right?
>> Mhm.
>> Is that you kind of are, because we're naming a trait that the pigs have and the Sentinelese have, right? And once you name that, I'm obviously going to refer back to my moral preference.
>> Yeah. But I'm not asking you...
>> The moral preference being the fact that it's not a human. So that's going to be the symmetry breaker between these two things.
>> Okay. So when you say "human," what do you mean? Like, genetics, or species? What do you mean?
>> Um, again, it has to encompass everything. I mean, you can be slightly different and still be human. It doesn't really matter too much on that front.
>> Listen, I'm going to present something to you, right? Is it going to be here? No. Give me a second. I'm going to share my screen, if you want to open that. So this is not it. Give me a second.
>> Um,
>> no, it's not going to be here either. [ __ ] I'm always getting lost. Uh, common arguments.
No, it's going to be in ethics, ethical concepts. So, um, right here, right? Here's the question. The question is not comparing, for instance, in this case, a cow or a sheep or a fish to human beings. That's not what we're looking for. We're looking for: here's a human being, right? And then imagine you are entering that human being into some kind of a machine, and I'm asking you what this machine would take away from a human being,
in the context of the same place where those animals are, right? So we're not going to change the context. If it's, like, attacking me or something like that, that's obviously a justified context to kill anybody.
>> But this human being enters this machine, and the machine removes some kind of traits. What traits is this machine going to remove to make it okay to systematically slaughter those human beings for burgers or for steaks and brains?
>> I mean, you're basically changing the nature of the person, right? If you strip every trait...
>> Yep.
>> That's what I'm asking you, though. What are the traits that exist?
>> Human.
>> Yeah. Brain. If it's no longer capable of our level of understanding,
>> then it's going to be... oh, sorry, you don't have to mute.
>> I mean, technically, it's irreducible, right? Again, there are going to be traits that are, you know... in one case, again, you can't really know if it's human or less human. I mean, we ask this question all the time in transhumanism, right?
>> Well, let me ask you a question.
>> Or whether a person is more machine than...
>> Yeah, I think the people in here are not understanding the actual question. So, even if one trait of a human being is removed, then you're saying it would be okay to slaughter them for steaks?
>> No, not really. Because one trait doesn't make up a human being.
>> So that's what I'm asking. I'm asking for a trait, or a list of traits, that, if removed from a human being, would make it okay to slaughter them for burgers. Are you willing to remove the CNS, memories, DNA structure?
>> I mean, sure, we removed the cerebral cortex to equalize.
>> Yeah, we removed the cerebral cortex earlier in the human being, to equalize them mentally, in intelligence, to the pig, and you said no, it wouldn't be okay to slaughter that human being. So you have to...
>> So that trait is eliminated, right? That trait is no longer part of your stack of traits,
>> because it wouldn't make you kill that human being.
So I'm looking for at least one trait that would actually cash out into you killing that human being, who's not going to be human, according to you, anymore. So please tell me what trait you should remove from a human being to make it okay to slaughter a human being for burgers or steaks.
>> Like I said, again, it's not just one trait. It's going to be a cacophony of traits.
>> Okay. But you gave me a bunch of them, and none of them panned out.
>> Let's say you take that human, right? I'll give you a quick example. You take that human, run him through the machine,
>> take his entire brain out, so only his brain stem is functioning, right? I mean, at that point, it might be morally permissible to kill him, because now he's a [ __ ] vegetable, right?
>> Well, that... Do you mean like brain dead?
>> I mean, yeah. I mean, a lot of people, you know, specify "take me off life support if I become a vegetable."
>> Oh, sure. Uh, but pigs are not like vegetables, right? They're not brain dead. So brain dead can't be the trait, because it doesn't exist in a pig.
>> Well, again, like I stated before, there are more traits that pigs don't have.
>> Yeah. Give me [ __ ] one that would
make it okay to kill a human being the
way you kill pigs
because so far none of them pan out.
Right. Again we keep reiterating.
>> How about this? How about this? We take the human, we make him quadrupedal, add pig skin, right? I mean, there's going to be a certain point, right? Even if we go back evolutionarily, it's going to be like, um, you know, it's not just one trait. I mean, there are multiple traits that we're going to have to go by.
>> Then eventually we get to human, right? So it seemingly is irreducible to what point I would consider it human and not human.
>> Mhm.
>> Like, I don't know. It's like when a lot of people debate abortion,
>> before they consider the fetus, you know, sentient or not sentient.
>> Wait, if you don't know the actual stack of traits that would make it okay for you to slaughter human beings for burgers, shouldn't you [ __ ] stop slaughtering pigs for now, until you do know what the trait difference between a pig and a human being is that would make it okay to kill human beings the way you kill pigs? Surely you can recognize that you don't have a good reason for it.
I mean, at this point, like I kind of alluded to it earlier, but the pig has no normative properties, or let's just say maybe in this case.
>> Well, um...
>> We went through that.
>> Yeah, go on. You can interfere, because I'm definitely not making
>> some type of deontologist view. Um, we can just say one of our principles, for some other reason, gives these animals value, right? Gives this pig normative value, right? And then under some virtue ethics theory, the pig can fulfill these specific virtues, like having sentience or having the necessary, um, neural networks and things like that, for them to fulfill these virtues, right? Under all these normative frameworks, these very popular ones, yes, a pig will have some type of normative value. If you're holding to some type of idiosyncratic normative view, then you just need to provide the argument for it, right? And if you're an anti-realist, this will get worse for you, right? Because then this entire argument, like you debating this, will be trivial. And I think if you're an anti-realist on morality, you should be a quietist when it comes to normative ethics, unless...
>> Okay. So, you're essentially trying to get to the core of it and run the...
>> I just want to know under what normative ethic do pigs...
>> You're running companions in guilt at the end of the day.
>> Am I wrong?
>> Huh? Can you repeat that?
>> You're basically running companions in guilt, essentially, for me, because I am an anti-realist.
>> Well, yeah. Well, wait. What kind of anti-realist?
>> Um, I'm an error theorist when it comes to...
>> This is all trivial then. Nothing... okay, nothing has normative value to you, because you're a normative nihilist.
>> Yeah, they're all false.
>> Wait, okay, so you realize... you realize then... okay, so you realize you should have bit every single bullet when VPM asked you. "Is it morally permissible to slaughter another human being?" That should have been an immediate yes from you. "Is it morally permissible to throw a baby off the cliff?" That's amoral under your view, because all of it just trivially follows.
>> I can still have like attitudes towards
normative things.
>> Yeah, I can. Yeah, I don't care. Wait,
why should anyone say ontologically
they're false? I mean,
>> why should anyone care about your
attitudes, Felix?
>> I'm going to unmute you both again, and then I'm going to leave the channel. This is not a [ __ ] philosophy channel. This is a channel where we talk about veganism. So if you want to talk philosophy, [ __ ] off somewhere else to do it.
>> Yeah, I'll just add to that. Um, this is about applied ethics, right? And veganism is an applied ethic. We're not talking about, like, meta or anything like that, right?
>> By the way, applied ethics is, like, so philosophical. Yeah,
>> we don't want to talk about philosophy.
>> That's... I can stick to normative stuff. I was just making...
>> Um, obviously, VPM, my issue with you isn't with the veganism. I think the fact that you say that non-sentient entities don't have moral worth seems quite ridiculous, because I value things... I morally value things like nature, right? I morally value things like Mount Everest.
>> You didn't give the reference to us.
>> Well, that's not right. I didn't say that you can't morally value nature or whatever that means. I'm just saying you value it for the sentient beings in nature, not for nature itself as a concept. If you do, it's kind of a cringe position.
>> Um, didn't you say that things without sentience don't have moral worth? Didn't...
>> No, they don't have moral worth to themselves, to the thing itself. They obviously have moral worth to sentient beings, right? My phone has moral worth because I own it, I need it, so you can't smash my phone. But a phone in a desert that nobody comes back for, that nobody claims ownership of? That phone is not morally worthy at all, and you can smash it. Do whatever the [ __ ] you want.
>> Distinction. There's going to be a distinction between, like, a phone and Mount Everest, right? Um, I don't even think... I think the distinction is one's man-made. I don't even think it's how much we value it, because I think these things would actually, like, objectively... there would still be some type of moral obligation to them. Um, even though humans exist, I think the other animals would have obligations towards things like nature too. I mean, I don't see why... You might find Mount Everest pretty or something, or some kind of...
>> Yeah. Like, it's not only...
>> That's not a moral consideration. That's an aesthetic consideration.
>> Well, wait. Yeah. But that's not the only reason, right? Cuz I...
>> What would be the moral consideration of Mount Everest?
>> Like... Yeah. Like, it provides nature for other entities. It provides home.
>> Yeah, exactly. It provides home to sentient beings. So it's worth for sentient beings. That's exactly my point, though. Mount Everest... Wait, hold on. Hold on, just let me explain. Mount Everest on its own is not morally worthy. It's only morally worthy because other sentient beings live on it and it's important to those sentient beings. That's exactly my point. So you can't value Mount Everest in a vacuum, right? Do you...
>> Yeah. Well,
>> do you value Mount... Right. It it...
>> Do you value... No, please. Do you value Mount Everest in a vacuum?
>> Uh, yeah, I would still value it in a vacuum.
>> Why?
>> Um, because the moral fact is going to be there.
>> Why? Right. I think... I think...
>> Moral fact.
>> Yeah. Yeah. It's just, if there exists a moral fact, it's going to be there, like, objectively, every time. That's why it's going to be there, right? But obviously the...
>> Wait, wait, wait. No, no, no. I don't understand why, actually. What are the traits of Mount Everest that you would actually value in a vacuum? Like, it's devoid... no sentient beings.
>> I would value... Yeah. I would value the moral fact that we ought take care of nature. I would value...
>> No, no. I'm asking you specifically about Mount Everest.
>> That's... I'm telling you the part that I value, right? Obviously, having moral worth is going to be a property. It's going to be a predicate.
>> What are the traits of Mount Everest that you would value in a [ __ ]
[ __ ]
>> I just told you the trait. I literally told you the trait.
>> The moral fact that we ought take care of nature. Mount Everest would still be a part of nature.
>> Why do... Why? No, hold on. That's not in a [ __ ] vacuum. I'm asking you about Everest in a vacuum.
>> Yeah, in a vacuum, it would still be a part of some nature, right? The set of nature.
>> No, it's literally not because it's in a
[ __ ] vacuum.
>> It literally would.
>> No, I'm literally saying Mount Everest, the dirt,
>> without any... Hold on. Hold on. Without any sentient beings on it and without the nature part: Mount Everest in a vacuum. Why would you value that?
>> Yeah, I told you. Because of the moral fact that we ought take care of Mount Everest, because we ought take care of nature, right? Nature isn't only animals.
>> That's it. Sounds like...
>> Why should we accept that moral... Why should we accept that normative, uh...
>> Um...
>> ...proposition? Well, well, I don't know why you accept it normatively, right?
>> Yeah, but you said it was, like, a normative fact, right? So...
>> Um, yeah, I mean, in this possible world, right? In this possible world, I believe we can only have normative facts when it comes to sentience, right? And moral facts are going to be distinct from those.
>> There you go. We're talking about normative. We don't give a [ __ ] about meta. Let's move on to veganism. I don't care halfway. Let's move on to veganism.
>> I mean, that's not good faith at all. Like, can we please be kind?
>> No no no no.
>> I think... I think normative facts have, like, reasons behind them. But I think moral facts... they don't have reasons behind them, right? Um, because, again, the way I kind of hold it: first we have the moral fact, and then from that we give reasons as to why we hold the moral fact, because I think we know the moral facts by our perceptions about the moral facts.
>> It just sounds like, "oh, I like Mount Everest, so let's like [ __ ] value it."
>> But you're presupp... I think you're begging the question though, like the moral...