First part is correct, the second part is not. GPT Neo is a 2.7B param model, while GPT-3 comes in various flavours up to 175B params. I appreciate the sentiment and what EleutherAI is doing with GPT Neo, but there is no open source equivalent of the full GPT-3 available for the public to use. Hopefully it's just a matter of time.
dave_sullivan|4 years ago
GPT Neo was trained at similar expense, and they released the weights. Use that.
damvigilante|4 years ago
ritz_labringue|4 years ago