When I was writing my post on animal rights last week, I was tempted to say something along the lines of “well, animals don’t have rights, because rights are based on reciprocity, and animals inherently can’t reciprocate rights (with maybe minor exceptions like dogs)”.
But in thinking this, I was reminded of a phenomenon, originally discovered in split-brain experiments, where people’s brains rationalize the phenomena they experience, rather than merely explaining them as we tend to assume.
I think I was doing the same thing. I had an internal hard-wiring in my brain which corresponds to undervaluing the rights of animals (which is what my ancestors needed in order to eat them with a sound mind), and my brain was attempting to rationalize that by pointing to a logical argument based on the constitution of rights.
And even now, this argument seems completely compelling to me, even though I know it’s simply a rationalization. My argument takes an abstract concept (rights), tells you what constitutes that abstract concept, and then draws a conclusion from it. Which doesn’t make sense in the slightest, because you might not agree with my claim about what a right consists of. Yet the argument still seems compelling to me.
Which is why I used the analogy I did, and not this argument.
But then, isn’t my analogy doing the same thing? After all, I’m basically arguing that animals don’t have rights simply because most humans act as if animals don’t have rights?
However, I think this style of argument is more likely to be true, for a very simple reason: if most people believe something that you don’t, it’s more likely that you’re the one who is wrong. That isn’t to say that most people can never be wrong about anything; certainly they can. It’s merely that I put a larger weight on consensus than I do on individual beliefs.
So if I’m given a good reason to suspect that people are systematically wrong about something, I’m perfectly willing to diverge from the consensus. But I don’t see any specific reason why that would apply in the case of animal rights.
Certainly, people are biased to be meat eaters. But I don’t see why it’s more likely that this bias stems from meat eating being objectively wrong, with us merely predisposed to believe that it isn’t, than from meat eating not being objectively wrong, which is exactly why we’re predisposed to believe that it isn’t.