So, the idea goes like this: expose programmatic access to your program, which potentially operates destructively (no Undo button) on potentially sensitive data; hand a sloppy LLM (sloppy because of its sheer unpredictability and its ability to fuck things up in ways a sober human with common sense never would) a Python interpreter; then let it run away with it and hope your boundaries are enough to stop it at the edges, YET without limiting the user too much?

Nah, I'm skipping this update.