Hi, the current state of LLM clients means you will likely need to write an enum wrapper:

enum Provider {
    OpenAI(rig::providers::openai::Client),
    // .. the rest of your providers here
}

We're taking a stab at dealing with this exact issue in #440, but for now this is probably the closest thing you're going to get to a solution.
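
As a rough sketch of how that wrapper can be used: give the enum a single prompt method that matches on the variant and dispatches to whichever client it holds. The Anthropic variant, the max_tokens value, and the exact Prompt/PromptError paths and builder signatures below are illustrative assumptions and may differ between rig versions.

use rig::completion::{Prompt, PromptError};

enum Provider {
    OpenAI(rig::providers::openai::Client),
    // An assumed second variant, just to illustrate the dispatch.
    Anthropic(rig::providers::anthropic::Client),
}

impl Provider {
    // Send a one-shot prompt to whichever client this variant wraps.
    async fn prompt(&self, model: &str, input: &str) -> Result<String, PromptError> {
        match self {
            Provider::OpenAI(client) => client.agent(model).build().prompt(input).await,
            // Anthropic's API requires an explicit max_tokens; the value here is arbitrary.
            Provider::Anthropic(client) => {
                client.agent(model).max_tokens(1024).build().prompt(input).await
            }
        }
    }
}

Callers can then hold a Provider (for example Provider::OpenAI(rig::providers::openai::Client::from_env())) and call .prompt(model, input) without caring which backend is behind it.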
