a13n|18 days ago
This example feels more like a bug in the law itself that should be corrected. If this behavior is acceptable, then it should be legal, so we can save everyone the hassle in the first place. I bet AI would be great at finding and fixing these bugs.
chmod775|18 days ago
Codifying what is morally acceptable into definitive rules is something humanity has struggled with for likely longer than written memory. And while you're out there "fixing bugs" one by one, millions of them, people are being affected by them.
> I bet AI would be great at finding and fixing these bugs.
Are we really going to outsource morality to an unfeeling machine that is trained to behave the way an exclusive club of people wants it to?
If that were one's goal, it's one way to stealthily nudge and undermine a democracy, I suppose.
AuryGlenz|18 days ago
But, again, who is going to decide to put forward a bill to change that? It's all risk and no reward for the politician.
fendy3002|18 days ago
The state of current AI does not give them the ability to know that, so the consideration is likely to be dropped.
quantified|18 days ago
Finding the bugs will be entertaining.
simondotau|18 days ago
One might imagine a distant future where laws could be dramatically simplified into plain-spoken declarations, to be interpreted by a very advanced (and ideally truly open source) future LLM. So instead of 18 U.S.C. §§ 2251–2260, the law could be as straightforward as:
"In order to protect children from sexual exploitation and eliminate all incentive for it, no child may be used, depicted, or represented for sexual arousal or gratification. Responsibility extends to those who create, assist, enable, profit from, or access such material for sexual purposes. Sanctions must be proportionate to culpability and sufficient to deter comparable conduct."
...and the AI will fill in the gaps.
salawat|16 days ago
No. No, thank you.