top | item 43346184

unsui | 11 months ago

I've called this out numerous times (and gotten downvoted regularly) as what I call the "Cult of Optimization,"

aka optimization for its own sake, aka pathological optimization.

It's basically meatspace internalizing and adopting the paperclip problem as a "good thing" to pursue, screw externalities and consequences.

And, lo and behold, my read on why it gets downvoted here is that a lot of folks on HN subscribe to this mentality, as it is part of the HN ethos to optimize, often pathologically.


jmount | 11 months ago

Love your point. "Lack of alignment" affects more than just AIs.