mjsweet | 1 year ago

Could this be related to their plans for local inference? Maybe the need to load larger models into memory has driven their decision?

bearjaws | 1 year ago

Has to be; even a small model takes 4 GB of RAM. Open up any O365-based product in Chrome and kiss another 2 GB goodbye. If you only have 8 GB, you're left with very little after the OS...
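For context on that 4 GB figure: holding a model's weights in memory takes roughly (parameter count) x (bytes per parameter), before any KV cache or runtime overhead. A quick back-of-envelope sketch (the 2B-parameter / fp16 numbers are illustrative assumptions, not a specific model):

```python
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Rough RAM needed just to hold the weights (ignores KV cache and overhead)."""
    return n_params * bytes_per_param / 1e9

# A ~2B-parameter model at fp16 (2 bytes per parameter) needs about 4 GB,
# in line with the "even a small model is 4 GB" figure above.
print(round(model_memory_gb(2e9, 2), 1))  # 4.0
```

Quantizing to 4-bit weights (0.5 bytes per parameter) would cut the same model to about 1 GB, which is why quantization matters so much for on-device inference.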

surfingdino | 1 year ago

So no actual increase in available RAM and no improvement in performance then. That'll be an extra £999, sir.