thecal | 2 years ago
Think of the model as a gigantic compiled binary: you send in strings in a certain format and get back a response. This is a web API wrapper for that, so you only need an HTTP client instead of having to run something like llama.cpp yourself.
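The "strings in, response out over HTTP" exchange can be sketched like this; the endpoint URL and JSON field names here are hypothetical stand-ins, not any particular provider's real API:

```python
import json
import urllib.request

# Hypothetical endpoint -- substitute your provider's actual URL and schema.
API_URL = "http://localhost:8080/v1/completions"

def build_request(prompt: str, max_tokens: int = 64) -> urllib.request.Request:
    """Package a prompt as an HTTP POST -- the 'string in' half of the exchange."""
    body = json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Hello, model.")
# urllib.request.urlopen(req) would then return the model's generated text;
# the wrapper hides the inference engine behind this single HTTP round trip.
```

The point is that all the heavy machinery (weights, inference code) sits behind the server; the client's job reduces to serializing a string and parsing one back.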