top | item 39787836


kken | 1 year ago

From what I gather, this project started out as an implementation of a code interpreter using a local LLM. Basically, the LLM turns your instructions into code, which is then executed. The idea is that having access to your native system's shell can be much more powerful than being limited to sandboxed Python.
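The loop described above can be sketched roughly like this. This is a hypothetical minimal illustration, not the project's actual API; `fake_llm` stands in for whatever local model the project uses, and a real implementation would of course need sandboxing or user confirmation before executing anything.

```python
import subprocess

def fake_llm(instruction: str) -> str:
    """Stand-in for a local LLM that emits shell code for an instruction."""
    # A real implementation would prompt the model here.
    canned = {"list files": "ls -1", "show date": "date"}
    return canned.get(instruction, "true")

def run_instruction(instruction: str) -> str:
    code = fake_llm(instruction)            # 1. model writes shell code
    result = subprocess.run(                # 2. execute it in the native shell
        code, shell=True, capture_output=True, text=True
    )
    return result.stdout                    # 3. output can be fed back to the model
```

The key point is step 2: the generated code runs against the real system, not a sandbox, which is exactly why the transparency argument below matters.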

In the meantime, it seems that models with vision capabilities have also been added, which can be used to drive GUI-based applications, not only the shell.

It's a very exciting concept that lives in a space where open source software should have a significant advantage due to its transparency. (Or would you give a black-box device access to everything on your computer?)

It also seems to be one of several emerging projects trying to sketch out how LLMs could change the way we interact with computers.
