badsectoracula | 6 days ago
When I saw this I expected something more... integrated. But when I tried it with a local LLM (using koboldcpp), after enabling the option to show localhost as a choice (it is hidden by default for some reason), all it did was load whatever webpage was running at the localhost URL, even though koboldcpp also provides an OpenAI-compatible endpoint, which is what I expected Firefox to use to drive its own UI. It seems to have some sort of heuristic to find the input box where you type queries, and it autofills that with the page text (or parts of it, if you have a selection), and that's all.
I kinda expected it to instead use the API endpoint, have its own chat UI, provide MCP tools for accessing and manipulating the page's content, let you create reusable prompts, etc. The current solution feels like something you'd throw together in a weekend at most.
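For reference, this is roughly what "using the API endpoint" would look like: a minimal sketch of talking to a local koboldcpp instance through its OpenAI-compatible chat-completions route, using only the Python standard library. The port (koboldcpp defaults to 5001) and the `"local"` model name are assumptions; adjust for your setup.

```python
# Hypothetical sketch: query a local koboldcpp server via its
# OpenAI-compatible endpoint instead of driving its web UI.
# Assumes the default koboldcpp port (5001); adjust as needed.
import json
import urllib.request


def build_chat_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /v1/chat/completions route."""
    payload = {
        "model": "local",  # koboldcpp serves whatever model it loaded
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(base_url: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(base_url, prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("http://localhost:5001", "Summarize the selected page text: ..."))
```

A browser integration built on this could render its own chat panel and feed page content in as context, rather than heuristically pasting text into whatever input box the localhost page happens to have.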