diff --git a/docs/guides/ollama-guide.mdx b/docs/guides/ollama-guide.mdx
index 200ebbdf31..3a2dc515aa 100644
--- a/docs/guides/ollama-guide.mdx
+++ b/docs/guides/ollama-guide.mdx
@@ -168,7 +168,7 @@ models:
### Model Capabilities and Tool Support
-Some Ollama models support tools (function calling) which is required for Agent mode. However, not all models that claim tool support work correctly:
+Some Ollama models support tools (function calling), a capability required for Agent mode and MCP integration. However, not all models that claim tool support work correctly:
#### Checking Tool Support
@@ -181,6 +181,10 @@ models:
- tool_use # Add this to enable tools
```
+
+ **MCP Tool Calling Compatibility**: Continue automatically normalizes messages sent to Ollama models to ensure compatibility with MCP tool calling, including working around known issues with Mistral and Gemma models. No additional configuration is required.
+
**Known Issue**: Some models like DeepSeek R1 may show "Agent mode is not
supported" or "does not support tools" even with capabilities configured. This
@@ -307,7 +311,19 @@ ollama pull deepseek-r1:32b
1. Add `capabilities: [tool_use]` to your model config
2. If still not working, the model may not actually support tools
-3. Switch to a model with confirmed tool support (Llama 3.1, Mistral)
+3. Switch to a model with confirmed tool support (Llama 3.1, Mistral, DeepSeek, Qwen)
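+
+A minimal sketch of step 1, using an illustrative model entry in `config.yaml`:
+
+```yaml
+models:
+  - name: Llama 3.1 8B
+    provider: ollama
+    model: llama3.1:8b
+    capabilities:
+      - tool_use
+```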
+
+#### MCP Tool Calling Errors
+
+**Problem**: Errors like "Unexpected role 'system' after role 'tool'" (Mistral) or "Invalid 'tool_calls': unknown variant 'index'" (Gemma)
+
+**Solution**: These errors are automatically handled by Continue's message normalization system. If you encounter them:
+
+1. Ensure you're using Continue v1.1.x or later
+2. The normalization runs automatically; no configuration is needed
+3. For persistent issues, see the [troubleshooting guide](/troubleshooting#ollama-model-errors-with-mcp-tool-calling)
+
+**Recommended models for MCP tool calling**: DeepSeek V3, Qwen3 family, Llama 3.1, Mistral (all versions)
#### Using Hub Blocks in Local Config
@@ -370,6 +386,17 @@ Use Continue with Ollama to:
- Identify potential bugs
- Generate documentation
+## Using Ollama with MCP Tools
+
+Ollama models can be used with MCP (Model Context Protocol) servers, which give the model access to external tools and data sources. When using MCP tools:
+
+- **Ensure tool support**: Add `capabilities: [tool_use]` to your model configuration
+- **Choose compatible models**: DeepSeek V3, Qwen3 family, Llama 3.1, and Mistral models work well with MCP tools
+- **Automatic normalization**: Continue automatically handles model-specific message formatting to ensure compatibility
+- **Error handling**: If you encounter tool calling errors, check the [troubleshooting guide](/troubleshooting#ollama-model-errors-with-mcp-tool-calling)
+
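+A minimal combined sketch, assuming the filesystem MCP server package (swap in whichever server you actually use):
+
+```yaml
+models:
+  - name: Qwen3 8B
+    provider: ollama
+    model: qwen3:8b
+    capabilities:
+      - tool_use
+
+mcpServers:
+  - name: Filesystem
+    command: npx
+    args:
+      - "-y"
+      - "@modelcontextprotocol/server-filesystem"
+      - "/path/to/project"
+```
+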
+For more information on MCP integration, see the [MCP guides](/guides/overview#mcp-integration-cookbooks).
+
## Conclusion
Ollama with Continue provides a powerful local development environment for AI-assisted coding. You now have complete control over your AI models, ensuring privacy and enabling offline development workflows.
diff --git a/docs/troubleshooting.mdx b/docs/troubleshooting.mdx
index 4e2bc9fd3d..fb530a4141 100644
--- a/docs/troubleshooting.mdx
+++ b/docs/troubleshooting.mdx
@@ -82,6 +82,42 @@ If your keyboard shortcuts are not resolving, you may have other commands that a
## MCP Server connection issues
+### Ollama model errors with MCP tool calling
+
+Certain Ollama models may encounter errors during MCP tool calling operations, particularly after tool execution when the model processes tool results.
+
+
+
+ **Error message:**
+ ```
+ 400 Bad Request: Unexpected role 'system' after role 'tool'
+ ```
+
+ **Cause:** Mistral family models don't accept system messages appearing after tool messages in the conversation.
+
+ **Solution:** Continue's message normalization (added in v1.1.x) handles this automatically by reordering system messages ahead of tool interactions; no configuration changes are required. If you're still experiencing this error, make sure you're running the latest version of Continue.
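+
+ For illustration, the role sequence that triggers the error and a normalized order (roles only, abbreviated):
+
+ ```yaml
+ # Rejected by Mistral: a system message directly follows a tool message
+ - role: user
+ - role: assistant   # issues a tool call
+ - role: tool        # tool result
+ - role: system      # <- 400 Bad Request here
+
+ # After normalization: the system message is moved ahead of the tool interaction
+ - role: user
+ - role: system
+ - role: assistant
+ - role: tool
+ ```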
+
+
+
+ **Error message:**
+ ```
+ 400 Bad Request: Invalid 'tool_calls': unknown variant 'index'
+ ```
+
+ **Cause:** Gemma models don't recognize the 'index' field in tool_calls structure.
+
+ **Solution:** Continue's message normalization (added in v1.1.x) handles this automatically by stripping the 'index' field from tool calls; no configuration changes are required. If you're still experiencing this error, make sure you're running the latest version of Continue.
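+
+ For illustration, an OpenAI-style tool call containing the offending field (hypothetical tool name, structure abbreviated); normalization drops the `index` key:
+
+ ```json
+ {
+   "tool_calls": [
+     {
+       "index": 0,
+       "id": "call_1",
+       "type": "function",
+       "function": { "name": "read_file", "arguments": "{ ... }" }
+     }
+   ]
+ }
+ ```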
+
+ **Note:** Some Gemma models may still experience compatibility issues with MCP tool calling even after normalization. Consider using alternative models like DeepSeek, Qwen, or Mistral for reliable MCP tool support.
+
+
+
### "spawn ENAMETOOLONG" error on macOS
If you're seeing an error like `Failed to connect to ""` with `Error: spawn ENAMETOOLONG` when using MCP servers on macOS, this is due to the environment being too large when spawning the MCP process.