top | item 39011128

hanselot | 2 years ago

Is it not a bit disingenuous to assume all open source AI proponents would readily back nuclear proliferation?

It's going to be hard to convince anyone if the best argument is terminator or infinite paperclips.

The first actual existential threat is destruction of opportunity specifically in the job market.

The same argument, though, can be made in the opposite direction: making use of AI can increase productivity and open up avenues of exploration that previously carried a much higher opportunity cost.

I don't think Miss Davis is a more likely outcome than corps creating a legislative moat (as they have already proven they will do at every opportunity).

The democratisation of AI is a philanthropic attempt to reduce the disparity between the 99 and the 1 percent. At least it could easily be perceived that way.

That being said, keeping up with SOTA is currently still insanely hard; the number of papers dropping in the space grows exponentially year on year. So perhaps it would be worthwhile to figure out how to use existing AI to fix some problems, like unreproducible results in academia that somehow pass peer review.

waffletower | 2 years ago

Indeed, both sentient hunt-and-destroy (à la Terminator) and resource exhaustion (à la infinite paperclips) are extremely unlikely extinction events, given supply chain realities in physical space. LLMs were developed upon largely textual amalgams; they are orthogonal to physicality and would need arduous human support to bootstrap an imagined AGI predecessor into having a plausible auto-generative physical industrial capability. The supply chain for current semiconductor technology is insanely complex. Even if you confabulate (like a current-generation LLM, I may add) an AGI's instant ability to radically optimize the supply chains for its host hardware, there would still be significant dependency on humans for physical materials. Robotics and machine printing/manufacturing are simply nowhere near the level of generality required for physical self-replication. These fears of extinction, undoubtedly born of stark cinematic visualization, are decidedly irrational and are most likely deliberately chosen narratives of control.