top | item 41526687


wesleyyue | 1 year ago

> Maybe something's timing out with the longer o1 response times?

Let me look into this. One issue is that OpenAI's API doesn't expose a streaming endpoint for the o1 models, so the whole response arrives at once, and it's possible an HTTP timeout somewhere in the stack fires before it does. Thanks for the report.
