brolumir | 2 years ago
Unfortunately, in practice that works only most of the time. At least in our experience (and the article says something similar), ChatGPT would sometimes return something completely different when a JSON-formatted response was expected.

blamy | 2 years ago
I've been using the same prompts for months and have never seen this happen on 3.5-turbo, let alone 4. https://gist.github.com/BLamy/244eec016beb9ad8ed48cf61fd2054...

tornato7 | 2 years ago
In my experience, if you set the temperature to zero it works 99.9% of the time, and then you can just add retry logic for the remaining 0.1%.
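The retry approach tornato7 describes can be sketched roughly as follows. This is a minimal illustration, not anyone's actual code: `generate` stands in for whatever function makes the model call (with temperature set to 0), and the retry count is an arbitrary choice.

```python
import json

def parse_json_with_retries(generate, max_retries=3):
    """Call generate() -- a stand-in for a temperature-0 model request --
    and parse the reply as JSON, retrying when the model drifts off-format."""
    last_error = None
    for _ in range(max_retries):
        raw = generate()
        try:
            return json.loads(raw)
        except json.JSONDecodeError as err:
            last_error = err  # malformed reply; fall through and try again
    raise ValueError(f"no valid JSON after {max_retries} attempts") from last_error

# Stand-in for the API: first reply is prose, second is valid JSON.
replies = iter(['Sure! Here is the data you asked for.', '{"status": "ok"}'])
result = parse_json_with_retries(lambda: next(replies))
# result == {"status": "ok"}
```

With temperature at 0 the failure case is rare, so the loop almost always returns on the first iteration; the remaining attempts only cover the occasional off-format reply the thread describes.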