use different LLMs #458
I'm writing an app that lets different users use different LLMs (different providers, different models, etc.). Users can change the LLM model from a configuration file. Please advise how I can handle the different LLMs with the same code. Is there any pattern I can use in Rig?
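As a starting point, the per-user model choice could live in a small config file like the one below. This is only a sketch: the key names (`provider`, `model`, `api_key_env`) are made up for illustration, not part of Rig.

```toml
# Hypothetical per-user LLM configuration (field names are illustrative).
[llm]
provider = "openai"        # which provider enum variant to construct
model = "gpt-4o"           # provider-specific model identifier
api_key_env = "OPENAI_API_KEY"  # env var holding the credential
```

At startup the app would read this file and map `provider` to the matching client, so switching models never requires a code change.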
Replies: 2 comments 1 reply
Hi, given the current state of the LLM clients, you will likely need to write an enum wrapper:

```rust
enum Provider {
    OpenAI(rig::providers::openai::Client),
    // .. the rest of your providers here
}
```

We're taking a stab at dealing with this exact issue in #440, but for now this is probably the closest thing you're going to get to a solution.
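To show how the enum wrapper plays out end to end, here is a self-contained sketch. The client structs and the `Complete` trait below are placeholders standing in for Rig's real provider clients, not Rig's actual API; the point is the pattern of building a variant from a config value at runtime and dispatching through one `match`.

```rust
// Placeholder trait standing in for a real completion API.
trait Complete {
    fn complete(&self, prompt: &str) -> String;
}

// Placeholder clients standing in for e.g. rig's provider clients.
struct OpenAiClient;
struct AnthropicClient;

impl Complete for OpenAiClient {
    fn complete(&self, prompt: &str) -> String {
        format!("openai: {prompt}")
    }
}

impl Complete for AnthropicClient {
    fn complete(&self, prompt: &str) -> String {
        format!("anthropic: {prompt}")
    }
}

// The enum wrapper: one variant per provider.
enum Provider {
    OpenAi(OpenAiClient),
    Anthropic(AnthropicClient),
}

impl Provider {
    // Select a provider from a configuration value at runtime.
    fn from_config(name: &str) -> Option<Provider> {
        match name {
            "openai" => Some(Provider::OpenAi(OpenAiClient)),
            "anthropic" => Some(Provider::Anthropic(AnthropicClient)),
            _ => None,
        }
    }

    // A single dispatch point: the rest of the app only calls this.
    fn complete(&self, prompt: &str) -> String {
        match self {
            Provider::OpenAi(c) => c.complete(prompt),
            Provider::Anthropic(c) => c.complete(prompt),
        }
    }
}

fn main() {
    let provider = Provider::from_config("openai").expect("unknown provider");
    println!("{}", provider.complete("hello"));
}
```

The trade-off is that every provider must be listed in the enum at compile time, but users can still switch between the compiled-in providers purely via configuration.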
I'm struggling with the exact same issue. I discovered that in most cases each provider's completion model type implements the shared `CompletionModel` trait, but there is a problem with OpenAI, for example: for some reason it also has its own `ResponsesCompletionModel` that shares no common trait. There are probably more such cases, which means creating an agent requires knowing the concrete type at compile time and recompiling the whole application for specific providers.
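One way around the missing shared trait is to define a small trait of your own and write an adapter impl per concrete model, then erase the type behind a `Box<dyn Trait>`. The sketch below uses placeholder model structs (not Rig's real types) to show the idea; for an async API the trait would need `async_trait` or a boxed-future return type, which is omitted here for brevity.

```rust
// A trait you own, covering just the operations your app needs.
trait Completion {
    fn complete(&self, prompt: &str) -> String;
}

// Placeholders standing in for concrete model types that share no
// common upstream trait (e.g. a chat model vs. a responses model).
struct ChatModel;
struct ResponsesModel;

// One adapter impl per concrete model bridges the gap; callers
// only ever see `Box<dyn Completion>`.
impl Completion for ChatModel {
    fn complete(&self, prompt: &str) -> String {
        format!("chat: {prompt}")
    }
}

impl Completion for ResponsesModel {
    fn complete(&self, prompt: &str) -> String {
        format!("responses: {prompt}")
    }
}

// Pick the concrete model from configuration at runtime and
// return it type-erased, so agent construction needs no generics.
fn model_from_config(kind: &str) -> Box<dyn Completion> {
    match kind {
        "responses" => Box::new(ResponsesModel),
        _ => Box::new(ChatModel),
    }
}

fn main() {
    let model = model_from_config("responses");
    println!("{}", model.complete("hello"));
}
```

Compared with the enum wrapper, trait objects avoid a central `match` per operation at the cost of dynamic dispatch and object-safety restrictions on the trait.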