ceras | 3 years ago
Even if you find a "pure" utilitarian (which is rare), it's considered "naive utilitarianism" to ignore the long-term and broad effects of creating harm. It's also considered intellectually arrogant to assume you know enough about the impacts of your decisions, and about moral philosophy, to justify causing definite short-term harm for potential long-term gain against the moral frameworks of just about everyone else.
On the whole, I can't personally think of anyone involved in EA, or any widely-read EA literature, that promotes or supports that type of thinking.