OpenAI hasn't published any information about the size or hardware requirements for running ChatGPT. Reading between the lines, the default ChatGPT Turbo model seems to be significantly smaller than GPT-3 (it's a distilled model), but it's probably still heavier than the Alpaca and LLaMA 7B models people are running (very slowly) on their single-GPU computers this week. You'd probably need multiple A100s to get performance comparable to the ChatGPT API.
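As a rough illustration of why parameter count dominates hardware needs, here's a back-of-envelope sketch of the memory required just to hold the weights (assuming fp16, i.e. 2 bytes per parameter, and ignoring KV-cache and activation overhead, so real usage is higher):

```python
def weight_memory_gib(n_params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate GiB needed to store model weights alone."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

# LLaMA 7B in fp16: ~13 GiB -- fits on one 24 GB consumer GPU
print(round(weight_memory_gib(7), 1))    # → 13.0

# GPT-3 175B in fp16: ~326 GiB -- needs several 80 GB A100s
print(round(weight_memory_gib(175), 1))  # → 326.0
```

This is why a distilled/smaller model matters so much: anything that fits in one GPU's VRAM avoids multi-GPU sharding entirely.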