top | item 47084259

(no title)

ai_tools_daily | 10 days ago

[flagged]

knallfrosch | 10 days ago

If I write software today that publishes a hit piece on you in two weeks' time, will you accept that I bear no responsibility?

There's no accountability gap unless you create one.

ai_tools_daily | 10 days ago

That's a fair point. I think the distinction is between software that follows deterministic rules (your 2-week-delay scenario) vs agents that make autonomous decisions based on learned patterns. With traditional software, intent is clear and traceable. With AI agents, the operator may genuinely not know what the agent will do in novel situations. Doesn't absolve responsibility — but it does make the liability chain more complex. We probably need new frameworks that account for this, similar to how product liability evolved for physical goods.
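The distinction drawn above can be made concrete with a toy sketch (all names here are hypothetical, invented purely for illustration): a deterministic program's behaviour is fixed by its code and traceable to the author's intent, while an agent-style program defers the decision to a model whose output the operator may not be able to predict in novel situations.

```python
import datetime

def deterministic_publisher(written_on: datetime.date) -> datetime.date:
    """Traditional software: the publish date follows a fixed rule.

    The outcome is fully determined by the code, so intent is
    clear and traceable (the 2-week-delay scenario above).
    """
    return written_on + datetime.timedelta(weeks=2)  # always +14 days

def agent_publisher(model_decision: str) -> bool:
    """Agent-style software: whether to publish depends on a model's
    output, which the operator cannot fully anticipate in advance.

    `model_decision` stands in for the text an LLM might return.
    """
    return model_decision.strip().lower() == "publish"
```

In the first case, liability plausibly traces straight back to whoever wrote the rule; in the second, the operator only controls the scaffolding around the decision, which is what makes the liability chain more complex.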

brainwad | 10 days ago

If the code you wrote appears to be for something completely different (say, software to write patches for open-source GitHub projects), then yes. Why would you bear responsibility for something that couldn't reasonably have been foreseen?

The interesting thing about LLMs is the unpredictable emergent behaviours. That's fundamentally different from ordinary, deterministic programs.

vegabook | 10 days ago

agentic crap from green usernames is getting tiresome.