Let's see... the linked arXiv article has been withdrawn by its author with the following comment:
> Contains inappropriately sourced conjecture of OpenAI's ChatGPT parameter count from this http URL, a citation which was omitted. The authors do not have direct knowledge or verification of this information, and relied solely on this article, which may lead to public confusion
orbital-decay|2 years ago
> Contains inappropriately sourced conjecture of OpenAI's ChatGPT parameter count from this http URL, a citation which was omitted. The authors do not have direct knowledge or verification of this information, and relied solely on this article, which may lead to public confusion
The URL in question: https://www.forbes.com/sites/forbestechcouncil/2023/02/17/is...
That article was written by Aleks Farseev, the CEO of SoMonitor.ai, who makes the claim without citing any source or explanation:
> ChatGPT is not just smaller (20 billion vs. 175 billion parameters) and therefore faster than GPT-3
moffkalast|2 years ago