
pyaamb | 7 days ago

I've been casually daydreaming about possible paths an "AI-augmented" newsfeed experience could take.

I'm imagining an initial group of agents scraping the web (Twitter/X, science/engineering/business publications) for events that could translate into stories their end users (humans) might be interested in.

Those stories get picked up by another set of bots (the equivalent of Reddit posters) and published on an agent-only social network, where a diverse set of agents with different "backgrounds"/personalities comment on and discuss the story.

Inputs from this post (article + comments) are picked up by an 'editor' agent and distilled into a final summarized article designed for human eyes, or for a human's personal agent or newsfeed agent.
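The three stages up to this point (scrapers, poster/persona bots, editor) can be sketched as a plain data pipeline. Everything here is hypothetical scaffolding: the persona names and the string-based "comments" are stand-ins for what would really be LLM-backed agents.

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    headline: str
    source: str
    comments: list[str] = field(default_factory=list)

# Stage 1: scraper agents turn raw web events into candidate stories.
def scrape(events):
    return [Story(headline=e["title"], source=e["source"]) for e in events]

# Stage 2: poster bots publish to the agent-only network, where persona
# agents with different "backgrounds" attach their takes as comments.
PERSONAS = ["skeptic", "optimist", "domain-expert"]  # hypothetical personas

def discuss(story):
    for persona in PERSONAS:
        story.comments.append(f"[{persona}] take on: {story.headline}")
    return story

# Stage 3: an 'editor' agent folds article + comments into one summary
# destined for the human's feed (here just a one-line digest).
def edit(story):
    return f"{story.headline} ({story.source}): {len(story.comments)} perspectives"

events = [{"title": "New battery chemistry", "source": "eng-journal"}]
feed = [edit(discuss(s)) for s in scrape(events)]
```

The point of the sketch is the shape, not the implementation: each stage only consumes the previous stage's output, so any single stage could be swapped for a real agent without touching the others.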

Humans, as the end users, browse their (AI-enhanced) newsfeed, which has its own private, continuously evolving algorithms based on knowledge of 1) what and 2) how its human likes to consume/think.

Information is "backpropagated" to inform/reinforce the initial scraper group of bots on what to look out for.
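The "backpropagation" here isn't gradient descent in the ML sense; the simplest reading is per-topic weights that engagement nudges up or down, so the scrapers prioritize what this particular human actually reads. A minimal sketch, with an assumed learning rate and click/skip signal:

```python
# Hypothetical feedback loop: each scraper topic carries a weight, and a
# read (clicked=True) or skip (clicked=False) nudges it up or down.
LEARNING_RATE = 0.1  # assumed step size, not from the original post

def update_weights(weights, topic, clicked):
    signal = 1.0 if clicked else -1.0
    # Clamp at zero so a topic can be forgotten but never go negative.
    weights[topic] = max(0.0, weights.get(topic, 1.0) + LEARNING_RATE * signal)
    return weights

weights = {"ai": 1.0, "biotech": 1.0}
update_weights(weights, "ai", clicked=True)        # reinforced
update_weights(weights, "biotech", clicked=False)  # de-emphasized
```

A real system would feed these weights back into the stage-1 scrapers' ranking of candidate events, closing the loop between what gets scraped and what gets read.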

discuss


bschwindHN | 7 days ago

How about you just _don't_ contribute to the slop pile? Also, that sounds like a machine that generates self-reinforcing echo chambers.

pyaamb | 7 days ago

I would argue that:

1. "Slop" doesn't come only from AI. Take a look at the headlines in your daily newsfeed, carefully crafted by humans.

2. This is an application of AI that serves human values by what it attempts to preserve.