Conversation

@technovangelist
I submitted an issue about adding Turbo as a hosted option for Ollama (#580). It turned out to be very easy to implement: just add an option that sets a request header. Works perfectly.
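A minimal sketch of what "an option that sets a header" can look like: the client keeps the standard Ollama chat payload and simply points at the hosted endpoint with a bearer token. The host URL, header name, environment variable, and helper function below are illustrative assumptions, not taken from the PR diff.

```python
# Hypothetical sketch: routing an Ollama request to the hosted Turbo
# endpoint by adding an Authorization header. URL, env var name, and
# function names are assumptions for illustration only.
import os


def build_turbo_request(model: str, prompt: str) -> dict:
    """Return URL, headers, and JSON body for a hosted Ollama chat call."""
    api_key = os.environ.get("OLLAMA_API_KEY", "")
    return {
        "url": "https://ollama.com/api/chat",  # hosted endpoint (assumed)
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
    }


req = build_turbo_request("llama3.1", "Say hello")
print(req["headers"])
```

The request body is unchanged from a local Ollama call; only the URL and the `Authorization` header differ, which is why the change is small.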

@technovangelist technovangelist changed the title Add Ollama Turbo support and update request handling in interpreter Add Ollama Turbo support for interpreter Aug 25, 2025