refactor: move enable_structured_output_with_tools to LitellmModel
Moved the enable_structured_output_with_tools parameter from the Agent class to
LitellmModel.__init__() to minimize the diff and isolate changes within the LiteLLM
adapter, as requested during code review.
Changes:
- Added enable_structured_output_with_tools parameter to LitellmModel.__init__() (see the sketch after this list)
- Stored it as an instance variable and used it throughout LitellmModel
- Removed parameter from Agent class and related validation
- Removed parameter from Model interface (get_response / stream_response)
- Removed parameter from Runner (no longer passed to model calls)
- Removed parameter from OpenAI model implementations
- Reverted test mock models to original signatures
- Updated test_gemini_local.py for model-level configuration
- Updated documentation to reflect model-level usage
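
A minimal sketch of what the first two bullets imply for LitellmModel.__init__ (illustrative only: the pre-existing parameters shown are assumptions about the constructor, and the body is not the actual implementation):

```python
# Illustrative sketch only: not the actual LitellmModel source.
class LitellmModel:
    def __init__(
        self,
        model: str,
        base_url: str | None = None,  # assumed pre-existing parameter
        api_key: str | None = None,  # assumed pre-existing parameter
        enable_structured_output_with_tools: bool = False,  # added by this change
    ) -> None:
        self.model = model
        self.base_url = base_url
        self.api_key = api_key
        # Stored as an instance variable so the workaround can be applied
        # wherever this model combines tools with a structured output type.
        self.enable_structured_output_with_tools = enable_structured_output_with_tools
```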
Before:
Agent(model=..., enable_structured_output_with_tools=True)
After:
Agent(model=LitellmModel(..., enable_structured_output_with_tools=True))
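
For reference, a runnable sketch of the new call pattern (it mirrors the docs/agents.md diff below; the tool, output type, and field names are stand-ins, and the enable_structured_output_with_tools keyword only exists on LitellmModel with this change applied):

```python
from pydantic import BaseModel

from agents import Agent, function_tool
from agents.extensions.models.litellm_model import LitellmModel


class WeatherReport(BaseModel):
    city: str
    summary: str


@function_tool
def get_weather(city: str) -> str:
    """Return a placeholder weather description for the given city."""
    return f"It is sunny in {city}."


# The flag now lives on the model, not on the Agent.
agent = Agent(
    name="Weather assistant",
    model=LitellmModel(
        "gemini/gemini-2.5-flash",
        enable_structured_output_with_tools=True,  # required for Gemini
    ),
    tools=[get_weather],
    output_type=WeatherReport,
)
```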
docs/agents.md (5 additions, 3 deletions)
````diff
@@ -81,14 +81,16 @@ from agents.extensions.models.litellm_model import LitellmModel
 
 agent = Agent(
     name="Weather assistant",
-    model=LitellmModel("gemini/gemini-1.5-flash"),
+    model=LitellmModel(
+        "gemini/gemini-2.5-flash",
+        enable_structured_output_with_tools=True,  # Required for Gemini
+    ),
     tools=[get_weather],
     output_type=WeatherReport,
-    enable_structured_output_with_tools=True,  # Required for Gemini
 )
 ```
 
-The `enable_structured_output_with_tools` parameter injects JSON formatting instructions into the system prompt as a workaround. This is only needed for models accessed via [`LitellmModel`][agents.extensions.models.litellm_model.LitellmModel] that lack native support. OpenAI models ignore this parameter.
+The `enable_structured_output_with_tools` parameter on [`LitellmModel`][agents.extensions.models.litellm_model.LitellmModel] injects JSON formatting instructions into the system prompt as a workaround. This is only needed for models that lack native support for using tools and structured outputs simultaneously (like Gemini).
 
 See the [prompt injection documentation](models/structured_output_with_tools.md) for more details.
````
(second changed file; header not captured in this view)

````diff
+        enable_structured_output_with_tools=True,  # Required for Gemini
+    ),
     tools=[analyze_data],
     output_type=Report,
-    enable_structured_output_with_tools=True,  # Required for Gemini
 )
 ```
 
-The `enable_structured_output_with_tools` parameter enables a workaround that injects JSON formatting instructions into the system prompt instead of using the native API. This allows models like Gemini to return structured outputs even when using tools.
+The `enable_structured_output_with_tools` parameter on `LitellmModel` enables a workaround that injects JSON formatting instructions into the system prompt instead of using the native API. This allows models like Gemini to return structured outputs even when using tools.
 
 See the [prompt injection documentation](structured_output_with_tools.md) for complete details.
````
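
Purely as an illustration of the workaround both documentation pages describe (injecting JSON formatting instructions derived from the output type into the system prompt), here is a hedged sketch: the helper name, the exact wording, and the use of the Pydantic JSON schema are assumptions, not the library's actual implementation.

```python
# Hypothetical illustration of prompt-injected structured-output instructions.
import json

from pydantic import BaseModel


class Report(BaseModel):  # stand-in output type, matching the example above
    title: str
    findings: list[str]


def structured_output_instructions(output_type: type[BaseModel]) -> str:
    """Build system-prompt text asking the model to emit schema-conforming JSON."""
    schema = json.dumps(output_type.model_json_schema(), indent=2)
    return (
        "When you give your final answer, respond with a single JSON object that "
        "conforms to this schema and include no other text:\n" + schema
    )


# Appending text like this to the system prompt lets models such as Gemini
# return structured output even while tools are enabled.
print(structured_output_instructions(Report))
```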