item 43331353 (no title)

AutoAPI | 11 months ago
An option to use a local LLM on the network, without needing to download the 2 GB "default model," would be great.

  yohannparis | 11 months ago
  It's in the README: https://github.com/johnbean393/Sidekick?tab=readme-ov-file#f...

    AutoAPI | 11 months ago
    It still forces you to download a model regardless.