afastow | 2 years ago
Obviously they're a black box, so it's possible there are some very rare edge cases where it happens anyway, but that would be a complete fluke. Changing the prompt even superficially causes what amounts to a butterfly effect in the model, preventing it from going down the exact same path and making the same mistake again.
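You can see the intuition with a toy stand-in for a greedy decoder (this is a hypothetical sketch, not a real model: `toy_next_token` just hashes the full context, the way a transformer's next-token choice depends on every preceding token). A one-character prompt change makes the very first "token" differ, and since each output feeds back into the context, the two runs diverge completely:

```python
import hashlib

def toy_next_token(context: str, vocab_size: int = 50) -> int:
    # Stand-in for a deterministic (greedy) decoder: the next "token"
    # is a pure function of the entire context so far.
    digest = hashlib.sha256(context.encode()).digest()
    return int.from_bytes(digest[:4], "big") % vocab_size

def generate(prompt: str, steps: int = 10) -> list[int]:
    context, tokens = prompt, []
    for _ in range(steps):
        t = toy_next_token(context)
        tokens.append(t)
        context += f" {t}"  # each output feeds back into the context
    return tokens

a = generate("Summarize this article.")
b = generate("Summarize this article!")  # superficial one-character change
print(a)
print(b)
```

The two sequences share no common path: once the first step differs, every later step is conditioned on a different history, which is the butterfly effect the comment describes.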