item 39839450
biddit | 1 year ago

> The larger context window (200k tokens vs ~16k)

Just to add some clarification: the newer GPT-4 models from OpenAI have 128k context windows[1]. I regularly load in the entirety of my React/Django project via Aider.

1. https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turb...
