item 36335724 | 2 years ago
They point openai.api_base to their server, which implements the same API.

OkGoDoIt | 2 years ago
That's clever. Do other LLM APIs do that?

dygd | 2 years ago
Yesterday there was a "Launch HN" thread for credal.ai [0] and I noticed that they use the same openai.api_base trick [1].

[0] https://news.ycombinator.com/item?id=36326525
[1] https://credalai.notion.site/Drop-In-APIs-3a45d32405c347e8bf...

anonzzzies | 2 years ago
It would take you (or GPT) 3 seconds to write an OpenAI-compatible wrapper; the inference API is trivial for all LLMs.

arbuge | 2 years ago
Ah, I missed that. Thanks.
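The wrapper anonzzzies describes can be sketched with nothing but the Python standard library: a server that accepts the same request shape as OpenAI's POST /v1/chat/completions and returns the same response shape, so an OpenAI client pointed at its base URL works unchanged. The port and the echo "backend" below are invented for illustration; a real wrapper would call an actual model where the echo happens.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class ChatHandler(BaseHTTPRequestHandler):
    """Speaks the OpenAI chat-completions wire format; the 'model' is a stub."""

    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        req = json.loads(self.rfile.read(length))
        user_text = req["messages"][-1]["content"]
        resp = {
            "id": "chatcmpl-local",
            "object": "chat.completion",
            "model": req.get("model", "local-echo"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": f"echo: {user_text}"},
                "finish_reason": "stop",
            }],
        }
        body = json.dumps(resp).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep demo output quiet

# Run the wrapper on an arbitrary local port in a background thread.
server = HTTPServer(("127.0.0.1", 8799), ChatHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the same request an OpenAI client would send, just aimed
# at our base URL instead of https://api.openai.com.
payload = json.dumps({
    "model": "local-echo",
    "messages": [{"role": "user", "content": "hello"}],
}).encode()
req = Request(
    "http://127.0.0.1:8799/v1/chat/completions",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as r:
    out = json.loads(r.read())
print(out["choices"][0]["message"]["content"])  # → echo: hello
server.shutdown()
```

With the legacy openai-python 0.x SDK (current at the time of the thread), the client-side swap is just `openai.api_base = "http://127.0.0.1:8799/v1"`; in the 1.x SDK the equivalent is `OpenAI(base_url=...)`.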