Hacker News thread (item 44804699)

coltonv | 6 months ago
What did you set the context window to? That's been my main issue with models on my MacBook: you have to set the context window so short that they are way less useful than the hosted models. Is there something I'm missing there?
hrpnk | 6 months ago
With LM Studio you can configure the context window freely. The max is 131072 for gpt-oss-20b.
coltonv | 6 months ago
Yes, but if I set it above ~16K on my 32GB laptop it just OOMs. Am I doing something wrong?
simonw | 6 months ago
I punted it up to the maximum in LM Studio - seems to use about 16GB of RAM then, but I've not tried a long prompt yet.
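The RAM behavior discussed in this thread comes largely from the KV cache, which grows linearly with the configured context length on top of the fixed cost of the weights. A minimal back-of-the-envelope sketch, where every model dimension (layer count, KV heads, head size, cache precision) is an assumed placeholder rather than the actual gpt-oss-20b architecture:

```python
# Rough KV-cache size estimate: it scales linearly with context length,
# which is why a larger context window can OOM even though the weights fit.
# All model dimensions below are ASSUMPTIONS for illustration only,
# not the real gpt-oss-20b configuration.

def kv_cache_bytes(context_len: int,
                   n_layers: int = 24,      # assumed
                   n_kv_heads: int = 8,     # assumed (grouped-query attention)
                   head_dim: int = 128,     # assumed
                   bytes_per_elem: int = 2  # assumed fp16/bf16 cache
                   ) -> int:
    # Factor of 2 covers the key tensor plus the value tensor per layer.
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

for ctx in (16_384, 131_072):
    gib = kv_cache_bytes(ctx) / 2**30
    print(f"context {ctx:>7}: ~{gib:.1f} GiB KV cache")
```

Under these assumed numbers the cache is modest at 16K but roughly 8x larger at the 131072 maximum, which is consistent with the pattern in the thread: a short context fits comfortably, while maxing it out reserves many more gigabytes up front.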