31 changes: 29 additions & 2 deletions docs/guides/ollama-guide.mdx
@@ -168,7 +168,7 @@ models:

### Model Capabilities and Tool Support

Some Ollama models support tools (function calling), which is required for Agent mode. However, not all models that claim tool support work correctly:
Some Ollama models support tools (function calling), which is required for Agent mode and MCP integration. However, not all models that claim tool support work correctly:

#### Checking Tool Support

@@ -181,6 +181,10 @@ models:
- tool_use # Add this to enable tools
```

<Info>
**MCP Tool Calling Compatibility**: Continue automatically handles message normalization for Ollama models to ensure compatibility with MCP tool calling. This includes fixing known issues with Mistral and Gemma models. No additional configuration is required.
</Info>

<Warning>
**Known Issue**: Some models like DeepSeek R1 may show "Agent mode is not
supported" or "does not support tools" even with capabilities configured. This
@@ -307,7 +311,19 @@ ollama pull deepseek-r1:32b

1. Add `capabilities: [tool_use]` to your model config
2. If still not working, the model may not actually support tools
3. Switch to a model with confirmed tool support (Llama 3.1, Mistral)
3. Switch to a model with confirmed tool support (Llama 3.1, Mistral, DeepSeek, Qwen)

#### MCP Tool Calling Errors

**Problem**: Errors like "Unexpected role 'system' after role 'tool'" (Mistral) or "Invalid 'tool_calls': unknown variant 'index'" (Gemma)

**Solution**: These errors are automatically handled by Continue's message normalization system. If you encounter them:

1. Ensure you're using Continue v1.1.x or later
2. The normalization happens automatically - no configuration needed
3. For persistent issues, see the [troubleshooting guide](/troubleshooting#ollama-model-errors-with-mcp-tool-calling)

**Recommended models for MCP tool calling**: DeepSeek V3, Qwen3 family, Llama 3.1, Mistral (all versions)

#### Using Hub Blocks in Local Config

@@ -370,6 +386,17 @@ Use Continue with Ollama to:
- Identify potential bugs
- Generate documentation

## Using Ollama with MCP Tools

Ollama models can be used with MCP (Model Context Protocol) servers for enhanced functionality. When using MCP tools:

- **Ensure tool support**: Add `capabilities: [tool_use]` to your model configuration
- **Choose compatible models**: DeepSeek V3, Qwen3 family, Llama 3.1, and Mistral models work well with MCP tools
- **Automatic normalization**: Continue automatically handles model-specific message formatting to ensure compatibility
- **Error handling**: If you encounter tool calling errors, check the [troubleshooting guide](/troubleshooting#ollama-model-errors-with-mcp-tool-calling)
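
As a concrete starting point, a local config combining a tool-capable Ollama model with an MCP server might look like the following. This is an illustrative sketch: the model choice and the filesystem server are examples, and field names follow Continue's `config.yaml` conventions — adjust them to your setup.

```yaml
models:
  - name: Llama 3.1 8B
    provider: ollama
    model: llama3.1:8b
    capabilities:
      - tool_use # Required for Agent mode and MCP tools

mcpServers:
  - name: Filesystem # Example MCP server; swap in your own
    command: npx
    args:
      - "-y"
      - "@modelcontextprotocol/server-filesystem"
      - "."
```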

For more information on MCP integration, see the [MCP guides](/guides/overview#mcp-integration-cookbooks).

## Conclusion

Ollama with Continue provides a powerful local development environment for AI-assisted coding. You now have complete control over your AI models, ensuring privacy and enabling offline development workflows.
36 changes: 36 additions & 0 deletions docs/troubleshooting.mdx
@@ -82,6 +82,42 @@ If your keyboard shortcuts are not resolving, you may have other commands that a

## MCP Server connection issues

### Ollama model errors with MCP tool calling

Certain Ollama models may encounter errors during MCP tool calling operations, particularly after tool execution when the model processes tool results.

<AccordionGroup>
<Accordion title="Mistral/Ministral: 'Unexpected role system after role tool'">
**Error message:**
```
400 Bad Request: Unexpected role 'system' after role 'tool'
```

**Cause:** Mistral family models don't accept system messages appearing after tool messages in the conversation.

**Solution:** This issue is automatically handled by Continue's message normalization (added in v1.1.x). If you're still experiencing this error:
- Ensure you're using the latest version of Continue
- The normalization automatically reorders system messages before tool interactions
- No configuration changes are required
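
The reordering described above can be sketched as follows. This is a minimal illustration of the idea — hoisting system messages so none follows a tool message — not Continue's actual implementation:

```python
def reorder_system_messages(messages):
    """Hoist system messages so none appears after a tool message.

    Minimal sketch of the normalization idea; not Continue's
    actual implementation.
    """
    # Partition while preserving relative order within each group
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest
```

After this normalization, a conversation ordered user → tool → system becomes system → user → tool, which Mistral-family endpoints accept.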
</Accordion>

<Accordion title="Gemma models: 'Invalid tool_calls: unknown variant index'">
**Error message:**
```
400 Bad Request: Invalid 'tool_calls': unknown variant 'index'
```

**Cause:** Gemma models don't recognize the `index` field in the `tool_calls` structure.

**Solution:** This issue is automatically handled by Continue's message normalization (added in v1.1.x). If you're still experiencing this error:
- Ensure you're using the latest version of Continue
- The normalization automatically removes the 'index' field from tool calls
- No configuration changes are required

**Note:** Some Gemma models may still experience compatibility issues with MCP tool calling even after normalization. Consider using alternative models like DeepSeek, Qwen, or Mistral for reliable MCP tool support.
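
This fix amounts to dropping the offending field before the request is sent. A minimal sketch of the idea (again illustrative, not Continue's actual code):

```python
def strip_tool_call_index(messages):
    """Remove the 'index' field from every tool_calls entry.

    Minimal sketch of the normalization idea; not Continue's
    actual implementation.
    """
    for message in messages:
        # get() or [] handles messages with no tool_calls key
        for call in message.get("tool_calls") or []:
            call.pop("index", None)
    return messages
```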
</Accordion>
</AccordionGroup>

### "spawn ENAMETOOLONG" error on macOS

If you're seeing an error like `Failed to connect to "<MCP Server Name>"` with `Error: spawn ENAMETOOLONG` when using MCP servers on macOS, this is due to the environment being too large when spawning the MCP process.