
Conversation

@annmalavet (Collaborator)

No description provided.

| Provider | Models | Notes |
|----------|--------|-------|
| **Gemini** | `gemini-2.0-flash-exp` (default)<br>`gemini-1.5-pro`<br>`gemini-1.5-flash` | Remote, requires API key |
| **Ollama** | `llama3.2` (default)<br>`llama3.1`<br>`codellama`<br>`mistral` | Local, free |

I changed this because the Ruff linter was complaining about it.
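
For illustration, here is a minimal sketch of how the defaults from the table above might be resolved in code; `DEFAULT_MODELS` and `resolve_model` are hypothetical names, not part of ShardGuard:

```python
# Hypothetical sketch only: maps each provider to the default model listed in
# the table above and falls back to it when no model is specified.
DEFAULT_MODELS = {
    "gemini": "gemini-2.0-flash-exp",  # remote, requires an API key
    "ollama": "llama3.2",              # local, free
}

def resolve_model(provider: str, model: str | None = None) -> str:
    """Return the requested model, or the provider's default."""
    return model or DEFAULT_MODELS[provider]
```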

console.print(f"[dim]Using Gemini model: {model}[/dim]")


def _print_tools_info(tools_description: str, verbose: bool = False) -> None:

I moved the print functions to a utility file to clean up the logic in this file.
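
As a rough sketch, the moved helper might look something like this; the signature comes from the diff above, but the body and file name are assumptions (it only presumes Rich's `Console`, which the snippet already uses):

```python
# utils/printing.py - illustrative sketch, not the actual file contents.
from rich.console import Console

console = Console()

def _print_tools_info(tools_description: str, verbose: bool = False) -> None:
    """Summarize available tools; print the full description in verbose mode."""
    if verbose:
        console.print(f"[dim]{tools_description}[/dim]")
    else:
        count = len(tools_description.splitlines())
        console.print(f"[dim]{count} tools available[/dim]")
```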

coord = CoordinationService(
registry_path=registry_path,
openai_model=openai_model, # for Coordinator to chain subprompts
step_executor=detected_model, # "ollama" | "gemini" | "openai"

This is so the executor can use a variety of LLM providers, but the current implementation requires the OpenAI Responses API for the chaining logic.
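
A hypothetical sketch of what that dispatch could look like once the other providers are wired in; the executor class names here are invented for illustration, not ShardGuard's real types:

```python
# Illustrative stand-ins only - not the actual executor classes.
class OllamaExecutor: ...
class GeminiExecutor: ...
class OpenAIExecutor: ...

EXECUTORS = {
    "ollama": OllamaExecutor,
    "gemini": GeminiExecutor,
    "openai": OpenAIExecutor,
}

def make_step_executor(step_executor: str):
    """Map the detected provider string ("ollama" | "gemini" | "openai") to an executor."""
    try:
        return EXECUTORS[step_executor]()
    except KeyError:
        raise ValueError(f"Unsupported step executor: {step_executor}") from None
```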

typer.echo(plan_obj.model_dump_json(indent=2))
# Print trace of tool calls
trace = result.get("trace") or []
if trace:

I should probably move this out of this file; it's just printing the trace.
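
If it does move out, a small helper along these lines would do; the `tool`/`args` keys are assumptions about the shape of the trace entries:

```python
import typer

# Sketch of an extracted trace printer; field names are assumptions.
def print_trace(trace: list[dict]) -> None:
    for i, call in enumerate(trace, 1):
        typer.echo(f"{i}. {call.get('tool')} args={call.get('args')}")
```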

"""
Produce JSON args for one tool call.
Make compatible with other LLM providers (needs testing).

Note the comment here: this logic is implemented, but I did not test it.
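
Since this path is untested, here is a sketch of the idea the docstring describes: ask any provider for JSON, then validate it against the tool's schema. The `complete` method and the schema shape are assumptions, not the real interface:

```python
import json

# Untested-path sketch: provider-agnostic JSON args for one tool call.
def produce_tool_args(provider_client, prompt: str, schema: dict) -> dict:
    raw = provider_client.complete(prompt)  # assumed provider interface
    args = json.loads(raw)                  # provider must return valid JSON
    missing = set(schema.get("required", [])) - args.keys()
    if missing:
        raise ValueError(f"Missing required args: {missing}")
    return args
```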

res = self.stdio.request("initialize", params, timeout=timeout)

# Complete the handshake before sending further requests
self.notify("notifications/initialized")

We need this handshake: MCP requires the client to send `notifications/initialized` after the `initialize` response, before any other requests.
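
For context, the MCP lifecycle requires exactly the order shown above. Sketched with the same `stdio` interface the snippet uses:

```python
# Handshake order per the MCP lifecycle, using the stdio client from above.
def handshake(stdio, params: dict, timeout: float = 30.0):
    res = stdio.request("initialize", params, timeout=timeout)  # 1. initialize request
    stdio.notify("notifications/initialized")                   # 2. ack; no reply expected
    return res  # server capabilities from the initialize response
```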

tools_txt = "\n".join(f"- {name}: {desc}" for name, desc in tool_summaries)

instructions = PLANNING_PROMPT_FULL
# This uses OpenAI *only* for planning - can be updated to use other LLM providers

Note this comment: we should have access to other providers, but we are only using OpenAI right now.
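
A minimal sketch of that OpenAI-only planning call using the Responses API; the model name is an assumption, and `instructions`/`tools_txt` come from the snippet above:

```python
from openai import OpenAI

# Sketch of the OpenAI-only planning call; the model choice is an assumption.
def plan(instructions: str, tools_txt: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.responses.create(
        model="gpt-4o",
        instructions=instructions,
        input=f"Available tools:\n{tools_txt}",
    )
    return response.output_text
```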

_DOLLAR_KEY_RE = re.compile(
r"\$([A-Za-z][A-Za-z0-9_]*)\$?"
)  # tolerated: $KEY or $KEY$ (a trailing "$" can happen due to LLM provider mistakes)


I added more logic here and therefore expanded the definition of 'sanitization': these methods now cover data validation and normalization. Some of the normalization is specific to the flows I have been testing.
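
A quick demonstration of the tolerant pattern above:

```python
import re

_DOLLAR_KEY_RE = re.compile(r"\$([A-Za-z][A-Za-z0-9_]*)\$?")

# Both forms match, so a stray trailing "$" from a provider is tolerated.
print(_DOLLAR_KEY_RE.findall("use $P1 and $P2$"))  # ['P1', 'P2']
```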

]
}
},
"email-operations": {

Adding this back in as a default for simplicity's sake, so a user who wants to try the 'hello world' of ShardGuard can just copy/paste the sample prompt. Without it, the user has to run `add-mcp` before trying the sample prompt.

"args": [
"-m",
"shardguard.mcp_servers.web_server"
]

Fixing an issue with reading this config.
