Execution Flow from Coordinator #26
Conversation
| Provider | Models | Notes |
|----------|--------|-------|
| **Gemini** | `gemini-2.0-flash-exp` (default)<br>`gemini-1.5-pro`<br>`gemini-1.5-flash` | Remote, requires API key |
| **Ollama** | `llama3.2` (default)<br>`llama3.1`<br>`codellama`<br>`mistral` | Local, free |
I changed this because the Ruff linter was complaining about it.
```python
console.print(f"[dim]Using Gemini model: {model}[/dim]")


def _print_tools_info(tools_description: str, verbose: bool = False) -> None:
```
I moved the print functions to a utils file to clean up the logic in this file.
```python
coord = CoordinationService(
    registry_path=registry_path,
    openai_model=openai_model,  # for Coordinator to chain subprompts
    step_executor=detected_model,  # "ollama" | "gemini" | "openai"
```
This is so the executor can use any of several LLM providers, but the current implementation still requires the OpenAI Responses API for the chaining logic.
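To make the intent concrete, dispatching on the `step_executor` provider string could look roughly like this. This is a minimal sketch: `make_step_executor` and the stubbed return values are hypothetical, not ShardGuard's actual API.

```python
# Hypothetical sketch of dispatching on a provider string such as the one
# passed as step_executor. The lambdas stand in for real API clients.
def make_step_executor(provider: str):
    """Return a callable that runs one step on the chosen LLM backend."""
    if provider == "openai":
        return lambda prompt: f"[openai] {prompt}"  # would call the Responses API
    if provider == "gemini":
        return lambda prompt: f"[gemini] {prompt}"  # would call the Gemini API
    if provider == "ollama":
        return lambda prompt: f"[ollama] {prompt}"  # would call a local Ollama server
    raise ValueError(f"unknown provider: {provider}")


executor = make_step_executor("ollama")
print(executor("list files"))  # → [ollama] list files
```

Keeping the chaining logic behind a single factory like this is what would let the OpenAI-only requirement be relaxed later without touching call sites.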
```python
typer.echo(plan_obj.model_dump_json(indent=2))
# Print trace of tool calls
trace = result.get("trace") or []
if trace:
```
I should probably move this out of this file; it's just printing the trace.
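A moved-out helper might look something like this. This is a sketch only; `format_trace` and `print_trace` are hypothetical names, and the `tool`/`args` record keys are assumed from the surrounding diff rather than confirmed.

```python
def format_trace(trace: list[dict]) -> str:
    """Render a list of tool-call records as numbered lines."""
    lines = [
        f"{i}. {call.get('tool', '?')}({call.get('args', {})})"
        for i, call in enumerate(trace, start=1)
    ]
    return "\n".join(lines)


def print_trace(trace: list[dict]) -> None:
    """Print the trace only when there is something to show."""
    if trace:
        print(format_trace(trace))


print_trace([{"tool": "read_file", "args": {"path": "a.txt"}}])
# → 1. read_file({'path': 'a.txt'})
```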
| """ | ||
| Produce JSON args for one tool call. | ||
| Make compatible with other LLM Providers (needs testing) |
Note the comment here: this logic is implemented, but I have not tested it.
```python
res = self.stdio.request("initialize", params, timeout=timeout)

# Initialize before sending request
self.notify("notifications/initialized")
```
We need this handshake: per the MCP spec, the client must send the `notifications/initialized` notification after the `initialize` exchange, before issuing other requests.
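The handshake boils down to two JSON-RPC 2.0 messages: an `initialize` request, then a `notifications/initialized` notification (no `id`, since notifications expect no reply). A minimal sketch of building those messages, where the `clientInfo` values and the `protocol_version` argument are illustrative assumptions:

```python
import json


def initialize_request(request_id: int, protocol_version: str) -> str:
    """JSON-RPC request that opens the MCP session."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": protocol_version,
            "capabilities": {},
            "clientInfo": {"name": "shardguard", "version": "0.0.0"},  # illustrative
        },
    })


def initialized_notification() -> str:
    """JSON-RPC notification (no id field) sent after the initialize response."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "notifications/initialized",
    })


print(json.loads(initialized_notification())["method"])
# → notifications/initialized
```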
| tools_txt = "\n".join(f"- {name}: {desc}" for name, desc in tool_summaries) | ||
|
|
||
| instructions = PLANNING_PROMPT_FULL | ||
| # This is using OpenAI *only* for planning - can be updated to use other LLM provider |
Note this comment: we should have access to other providers, but we are only using OpenAI right now.
```python
_DOLLAR_KEY_RE = re.compile(
    r"\$([A-Za-z][A-Za-z0-9_]*)\$?"
)  # tolerated: $KEY or $KEY$ (could happen in LLM provider mistakes)
```
I added more logic here, and therefore expanded what 'sanitization' means: these methods now cover both data validation and normalization. Some of the normalization is specific to the flows I have been testing.
```json
        ]
    }
},
"email-operations": {
```
I'm adding this back in as a default just for simplicity's sake, so that a user who wants to try out the 'hello world' of ShardGuard can simply copy/paste the sample prompt. Without this, the user has to run `add-mcp` before trying the sample prompt.
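For context, a default registry entry of this shape (field names inferred from the surrounding diff; the module path `shardguard.mcp_servers.email_server` is an illustrative guess, not a confirmed name) might look like:

```json
{
  "email-operations": {
    "command": "python",
    "args": [
      "-m",
      "shardguard.mcp_servers.email_server"
    ]
  }
}
```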
| "args": [ | ||
| "-m", | ||
| "shardguard.mcp_servers.web_server" | ||
| [ |
Fixing an issue with reading this (the registry entry was malformed).