You can trivially enforce that at the AI provider level, which covers 99% of the problem the law is designed to address.
Of course it doesn't cover foreign state psyop operations, but the fact that enforcing laws against organized crime and adversary state actors is hard isn't specific to AI.
Are you not aware of open-weights models and local generation? I think the vast majority of deepfake content is being generated in basements on RTX cards, not on public providers. People already have all this content, have archives of it, and can run it airgapped. The cat is out of the bag.
You don't have to prove anything. You just have to mark the outputs of your slop generator appropriately. "Proving" it one way or the other is the regulator's problem when it comes to enforcement.