-
When you use the LiteLLM Proxy, you can use the default `openai` Python client, such as:
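(The original snippet did not survive the page capture; the following is a minimal sketch of the kind of call in question, assuming an `openai>=1.x` client pointed at a locally running proxy. The base URL, API key, and model name are placeholders.)

```python
# Sketch (assumed setup): the standard openai client talking to a LiteLLM Proxy.
# base_url, api_key, and model are placeholders, not values from the discussion.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # proxy endpoint (placeholder)
    api_key="sk-1234",                 # proxy key (placeholder)
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # model alias configured on the proxy (placeholder)
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```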
However, `create` has a specific set of arguments (no `**kwargs`, for example), and therefore I don't see how we can pass provider-specific parameters to some of the providers, as one can with the barebone `litellm` Python SDK (documented e.g. here). Am I missing something?
-
Oh, I found this, so it's probably fine: https://docs.litellm.ai/docs/completion/provider_specific_params#proxy-usage. We just shouldn't use the built-in `openai` client, but raw requests.
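(A minimal sketch of the raw-requests approach the linked docs describe: POST to the proxy's OpenAI-compatible endpoint and put provider-specific fields directly in the JSON body. The URL, key, model alias, and the `top_k` parameter are illustrative placeholders.)

```python
# Sketch: bypass the openai client and hit the LiteLLM Proxy with `requests`,
# so extra provider-specific fields can be added to the request body freely.
import requests

response = requests.post(
    "http://localhost:4000/chat/completions",     # proxy endpoint (placeholder)
    headers={"Authorization": "Bearer sk-1234"},  # proxy key (placeholder)
    json={
        "model": "claude-3-5-sonnet",             # model alias (placeholder)
        "messages": [{"role": "user", "content": "Hello!"}],
        "top_k": 10,  # provider-specific parameter the client signature would reject
    },
)
print(response.json()["choices"][0]["message"]["content"])
```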