item 40863277


E_Bfx | 1 year ago

How easy is it to switch from OpenAI to testing an LLM on-premise?


asif_ | 1 year ago

We provide complete flexibility in how you call your LLM. If your on-prem LLM is behind an API, you just write the standard code to call that API from within our framework.
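A minimal sketch of what "standard code to call your API" might look like, assuming the on-prem server exposes an OpenAI-compatible chat-completions route (the URL and model name here are placeholders, not anything specific to the framework above):

```python
import json
import urllib.request

# Hypothetical on-prem endpoint; many local LLM servers (vLLM, Ollama,
# llama.cpp server, etc.) expose an OpenAI-compatible route like this.
ON_PREM_URL = "http://localhost:8000/v1/chat/completions"


def build_request(prompt, model="local-model", url=ON_PREM_URL):
    """Build an OpenAI-style chat-completions HTTP request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def call_llm(prompt):
    # The same code shape works for OpenAI or an on-prem server:
    # only the URL (and any auth header) changes.
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

Because the request and response shapes match OpenAI's, switching between the hosted API and an on-prem deployment reduces to changing the base URL and credentials.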