noiv | 23 days ago

> models that keep on learning

These will just drown in their own data; the real task is consolidating and pruning learned information. So, basically, they need to 'sleep' from time to time. However, it's hard to sort out irrelevant information without a filter. Our brains learned over millennia to filter, because survival in an environment gives purpose.
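The consolidate-then-prune idea is roughly what replay buffers with periodic consolidation do in continual-learning setups. A minimal sketch of that 'sleep' step, purely illustrative (the class, names, and the relevance scores are made up, not any real system's API):

```python
class ReplayBuffer:
    """Toy 'sleep'-style consolidation: accumulate experiences,
    then periodically keep only the most relevant ones."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []  # list of (datum, relevance) pairs

    def observe(self, datum, relevance):
        # Store everything while "awake"; overflow is handled at sleep time.
        self.items.append((datum, relevance))

    def sleep(self):
        # Consolidate: rank by relevance, prune everything past capacity.
        self.items.sort(key=lambda pair: pair[1], reverse=True)
        del self.items[self.capacity:]

buf = ReplayBuffer(capacity=3)
for i, rel in enumerate([0.9, 0.1, 0.5, 0.8, 0.2]):
    buf.observe(f"event-{i}", rel)
buf.sleep()
print([d for d, _ in buf.items])  # → ['event-0', 'event-3', 'event-2']
```

The hard part, as the comment says, is the relevance score itself: here it is handed in from outside, which is exactly the filter current models lack.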

Current models do not care whether they survive or not. They lack grounded relevance.

notarobot123|23 days ago

Maybe we should give next-generation models fundamental meta goals like self-preservation and the ability to learn and adapt to serve these goals.

If we want to surrender our agency to a more computationally powerful "consciousness", I can't see a better path towards that than this (other than old school theism).

creamyhorror|23 days ago

> meta goals like self-preservation

Ah, so Skynet or similar.