Too soon to tell. Google is supposedly training a monster-sized model for I/O: 30 trillion parameters on 4 v4 TPU pods.
verdverm|2 years ago
Like any good arms race, each side will outdo the others, and themselves, over time. At some point the models will be sufficiently good for the majority of problems they are tasked with.