Not that I'm aware of, but I've run LLMs in Termux myself and am pretty sure it's not a huge leap from there to a simple Python scaffold with a restricted grammar (the kind llama.cpp supports, for consistent structured output) and a Tasker integration. The hardest part would be trigger-word activation (maybe a hardware key-bind is enough for you?). It would probably run well locally with all the cumulative speedups over the past year, but I do bet it would drain your battery more than Google Assistant, and I'm not sure that moving it over the network would change that.
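To make the scaffold idea concrete, here's a minimal sketch: a GBNF grammar (the llama.cpp dialect) pins the model's output to a tiny JSON command schema, and the scaffold turns that into an `am broadcast` call Tasker can pick up from inside Termux. The intent action and extra names (`ACTION_TASK`, `task_name`, `par1`) are assumptions based on Tasker's external-access convention, not verified against a current Tasker build:

```python
import json
import shlex

# GBNF grammar (llama.cpp dialect) forcing output of the shape
# {"task": "...", "arg": "..."} -- this is what gives you consistent
# structured output instead of free-form chat.
COMMAND_GRAMMAR = r'''
root   ::= "{" ws "\"task\":" ws string "," ws "\"arg\":" ws string ws "}"
string ::= "\"" [a-zA-Z0-9_ .:/-]* "\""
ws     ::= [ \t\n]*
'''

def tasker_command(model_output: str) -> str:
    """Turn the grammar-constrained model output into an `am broadcast`
    shell command for Tasker (intent/extra names are hypothetical)."""
    cmd = json.loads(model_output)
    return (
        "am broadcast -a net.dinglisch.android.tasker.ACTION_TASK "
        f"--es task_name {shlex.quote(cmd['task'])} "
        f"--es par1 {shlex.quote(cmd['arg'])}"
    )

# With llama-cpp-python you would pass the grammar roughly like:
#   llm(prompt, grammar=LlamaGrammar.from_string(COMMAND_GRAMMAR))
# so model_output is always valid against the schema above.
print(tasker_command('{"task": "set_alarm", "arg": "07:30"}'))
```

The point of the grammar is that the dispatch side never has to handle malformed output; the model physically can't emit anything outside the schema.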
smusamashah|1 year ago
It may not be as good as ChatGPT, but it will be good enough for the things we want an Android assistant to do.
mdrzn|1 year ago