nperez | 1 year ago

Maybe an unpopular opinion, but it sounds like they're doing this the right way.

I want to be able to run a local model using something like vLLM or FastChat, then call it from a context menu. No obnoxious toolbar taking up the UI like Edge has, just access to the tools I'm running when I need them.

That's what this appears to be - not a case of a specific AI service being shoved in anyone's face.
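The workflow above can be sketched minimally. This assumes vLLM's OpenAI-compatible server running locally on its default port (8000); the model name and prompt are placeholders:

```python
import json

# Hypothetical sketch: build the request body for a locally hosted model
# served by vLLM's OpenAI-compatible API. Model name and prompt are
# placeholders, not real identifiers.

def build_chat_request(model: str, prompt: str) -> dict:
    """Construct the JSON body for a /v1/chat/completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

payload = build_chat_request("local-model", "Summarize the selected text.")
body = json.dumps(payload)

# To actually send it (requires a running vLLM server at localhost:8000):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:8000/v1/chat/completions",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
```

A browser context-menu integration would just wire the selected text into the prompt and POST this body to whatever local endpoint the user configured.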


PaulKeeble | 1 year ago

AI models, however, are pretty big, so this is going to make the browser installation a lot larger, even if the model is only pulled down when you first use the feature.