top | item 44074765

intuitionist | 9 months ago

I dunno. Utilitarianism sounds nice on the surface—how can you be against the greatest good for the greatest number?—but it’s pretty under-specified (hedonic or preference? act or rule? do you discount future beings’ utils, and at what rate?) and if you take any particular specification seriously you get moral claims that are wildly counterintuitive, like “insect suffering is orders of magnitude more important than heart disease in humans” or “there may be quadrillions of sentient beings in the far future, and making their lives 1% better is a better use of resources than eradicating malaria now” or “it’s morally justified to steal billions of dollars of other people’s money to give to pandemic prevention and AI safety.” And maybe these are correct claims, but they definitely don’t align with many people’s moral intuitions, and it’d be a tall task to convince those people.
