top | item 35446691


stewfortier | 2 years ago

Unfortunately, everything but the AI works offline. Though, maybe that's a feature if you're planning a more mellow retreat :)


sasas | 2 years ago

Have you considered a limited LLM that could run locally?

> planning a more mellow retreat

The objective here is to force myself to go somewhere internet access is impossible (no phone reception, and I don't have Starlink), with the goal of focused, productive output and limited distractions.

The idea came to mind after reading about John Carmack doing this for a week, diving into AI using nothing but classic textbooks and papers as reference material to work from.

EDIT: here is the HN thread on Carmack's week-long retreat:

https://news.ycombinator.com/item?id=16518726

capableweb | 2 years ago

> Have you considered a limited LLM that could run locally?

I think there are two main issues here. LLMs are large (the name even hints at it ;) ), and the smaller ones (still multiple GB each) are really, really bad.

Edit: and they use a ton of memory, either RAM if running on the CPU or VRAM if on the GPU.
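For a rough sense of the sizes involved, here's a back-of-envelope sketch (my own numbers, not from the thread): weight memory is roughly parameter count times bits per weight, ignoring the KV cache, activations, and runtime overhead.

```python
def model_size_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB: params * bits / 8 bits-per-byte / 1e9."""
    return n_params * bits_per_weight / 8 / 1e9

# A 7B-parameter model at common precisions/quantizations:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{model_size_gb(7e9, bits):.1f} GB")
# 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB
```

So even an aggressively quantized 7B model wants several GB of RAM or VRAM just for the weights, which is why "small" local LLMs are still multi-GB downloads.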