Monday, February 26, 2007

Sentience, Sapience & Value

Some philosophers seem to argue from the position that sapience requires that we treat an object as a moral equal. But does that make sense? I'm tempted to say yes (the relative path is always the hardest to think through) -- but what about the practical concerns? We know, for the most part, how sapient human beings think -- other beings, though often predictable, are alien in their thoughts (dogs, for example).

Could a species possibly be sapient yet lack a moral sense, or does morality follow directly from sapience?

UPDATE: The most obvious problem of all is the epistemological question: how do we really know whether any other beings are sentient or sapient? This section in Wikipedia describes the philosophy and its problems fairly well.

I've just learned that sentience and sapience are actually quite distinct concepts, with the former not implying self-awareness and the latter implying it. It's left me a bit confused. Regardless, Singer uses sentience: if a creature has nerve endings which transmit pain signals to a brain, then it is sentient (many animals do -- how about starfish and sponges?). If it doesn't, then it doesn't suffer. If it does suffer, we need to minimize its suffering (to what degree?). Singer seems to ignore the relevance of "sapience" as I've described it, but I'm not so sure that's a good path.

Anyway. I need to do some late-night grocery closes.

5 comments:

Simple Minded said...

Morality is based more in culture than in sentience. Culture, you could argue pretty easily, comes from sentience. Morals, or what makes a person moral, are largely determined by the group of people they come from. Cannibalism is an example of this: it fits the moral expectations of some cultures, yet not of others.

ADHR said...

There's a difference between morality and a particular moral code. Moral codes are instances of morality in cultures. But, that aside, I'm not sure about the connection of sentience to morality. I would've connected sentience to awareness and sapience to self-awareness before trying to draw a line to morality.

undergroundman said...

But, that aside, I'm not sure about the connection of sentience to morality.

I'm not so sure either. I'm not trying to make a case. I think the case is implicit in Singer's philosophy of vegetarianism. The main relevant difference between animals and vegetables is a degree of sentience or (what I had taken to be a synonym) self-awareness. I didn't think it was necessary to connect those things because I thought they were implied. I see now that I'm wrong and sentience does not imply self-awareness. :p So I'll change that word to sapience. I made the common mistake of conflating the two.

I'm assuming that sapience potentially includes animals, as Singer seems to think it does.

Singer assumes that we must treat animals as if they were humans because animals have a capacity for pain, enjoyment, and happiness. There are a couple of problems that I see: how do we know whether they have the capacity for pain, and how much does this pain matter if the animal is not self-aware? It seems to me like sentience is an assumed part of Singer's ethics, but maybe I'm wrong.

Then there's the problem of whether sapience necessarily means that someone (something - another sapient species) must be treated as a moral equal.

Simple Minded said...

I'm not sure I understand the difference between morality and a particular moral code. Is there a particular philosophical definition that you're relying on to make this distinction? The definition of morality I know is a doctrine or system of morals.

ADHR said...

UGM: I figured you were just speculating; I was agreeing with you that the connection is obscure. Singer's usual strategy, from what I know of him, is to define sentience as "the ability to feel pain" (which is parochial already), and then claim that all animals are sentient because their nervous systems are approximately like ours. I'm not sure what he'd say about things like lobsters, which have quite different neural structures. But the relevance of the ability to feel pain comes from the fact that he's a Benthamite utilitarian. He basically bridges the gap between sentience and morality by definition.

SM: It's the difference between more and less determinate sets of rules. I can come up with a set of rules that looks like this: "Don't murder. Don't steal. Be a good friend." (Just three to make the example easy.) I can also make the rules more determinate -- add more detail. "Don't kill people who have not offended your honour, for that would be murder. Don't take property that you have not legitimately purchased or taken in war, for that would be stealing. Defend the honour of those you love, for that is being a good friend."

The difference that I'm drawing, then, is between the former set -- the general moral principles -- and the latter -- the more detailed, culturally-rich moral code. (I didn't have any particular culture in mind, which is why the detail is actually a little thin on the ground. In any actual culture, the principles will get instantiated in great detail, even down to prohibited and required sets of speech acts.)