detente18 | 2 years ago

LiteLLM proxy (100+ LLMs in OpenAI format) is drop-in compatible with the OpenAI API. Here's how to call it with the openai SDK:

```
import openai

client = openai.OpenAI(
    api_key="anything",              # proxy key - if set
    base_url="http://0.0.0.0:8000"   # proxy url
)

# request sent to model set on litellm proxy
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "this is a test request, write a short poem"}
    ]
)

print(response)
```

Docs - https://docs.litellm.ai/docs/proxy/quick_start
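Because the proxy speaks the OpenAI wire format, you don't even need the SDK; any HTTP client can POST a standard chat-completions body. A minimal sketch of building that request body (the base URL and model name are placeholders for whatever your proxy is configured with):

```
import json

# Standard OpenAI chat-completions payload; the proxy forwards it
# to whichever backend model it was started with.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "this is a test request, write a short poem"}
    ],
}

body = json.dumps(payload)
print(body)
```

POST this JSON (with `Content-Type: application/json`) to the proxy's chat-completions route, e.g. `http://0.0.0.0:8000/chat/completions` per the quick-start docs.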
