djrhails | 4 years ago
The drunk driving parallel I thought was more than a little amusing. A more apt comparison might just be driving itself - it's a risky activity both for yourself and for others around you, perhaps even at the speculated 1% marker, but that doesn't appear to create any sort of moral imperative to change our willingness to drive.
Equally, I couldn't help noticing the subtle anthropomorphism going on - name-dropping depression as related to anhedonia is a clever literary technique for encouraging this natural thought pattern; even with the added "perhaps" and "some think", it does nothing to counter the "bias ... easily distorting our intuitions".
I'd love to see a similar essay for plants - or even for the AI they mention. I could see the outline of an equally persuasive essay. (A shame the author treats AI as trivially less likely to be sentient - a view I have trouble justifying beyond a "biological bias".)
yboris | 4 years ago
You say:
> A more apt comparison ... doesn't appear to create any sort of moral imperative to change our willingness to drive.
How convenient. You choose something that we can't live in our modern world without (driving a car) and say "see, no need to change our behavior".
Meanwhile, farming bugs for food is not at all essential in the current world - we can easily feed everyone without subjecting ourselves to the possibility of doing something morally horrible.
Here's another parallel: you decide to start building sheds and burning them down. You then realize there is a 1% chance that the current shed has a child hiding in it; do you burn it down because "1% is a low chance"? Of course not. And that's the point the author is making - when there is even a low percent chance of something very bad happening, and you're not doing anything essential or necessary, you ought not do it.
xyzzy123 | 4 years ago
The author is subtly invoking a utilitarian trick where you multiply a tiny number by a very large number and arrive at a nonsensical result.
So for example, the tiny harm of killing 1 insect times trillions of insects = unspeakable abomination.
If we follow this a bit further we can reasonably conclude that one of the most important moral problems for the human race to address is insect welfare and life extension.
> when there is even a low percent chance of something very bad happening, and you're not doing anything that is essential or necessary, you ought not do it.
This position is called negative utilitarianism (https://en.wikipedia.org/wiki/Negative_utilitarianism), and it's not one that makes sense to me, personally.
OzyM | 4 years ago
It seems sensible to say: these creatures, with nociceptors and familiar apparatus for feeling pain, who react to bodily damage in ways that suggest they can suffer, and for whom pain can obviously serve the same evolutionary benefit it does in humans - they have a moderate to high chance of feeling pain and being conscious to some degree, so we should be careful.
Plants may release chemicals in response to physical threats, but they lack the parts of the nervous system we attribute pain to, don't seem to have any level of consciousness, and would gain no evolutionary benefit from subjective suffering. Therefore, morally, there's no reason to treat them as more than inanimate objects.
I feel like AI could theoretically someday have the potential to suffer, but that isn't really a current concern.
Based on the available evidence, the article's argument for at least being careful about insects' potential suffering seems sensible, but the plant argument strikes me as absurd.
limbicsystem | 4 years ago