A demonstration of StackQL's Model Context Protocol (MCP) integration with OpenAI, providing a natural language chat interface for querying and analyzing cloud infrastructure.
This demo showcases how StackQL's MCP server can be integrated with AI language models (like OpenAI's GPT-4o mini) to enable natural language querying of cloud infrastructure. Users can ask questions in plain English about their cloud resources, and the AI agent will:
- Understand the intent
- Discover available resources using StackQL's MCP tools
- Construct appropriate StackQL queries
- Execute queries and retrieve results
- Present findings in a clear, conversational format
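The tool-dispatch step of that loop can be sketched as a thin routing layer between the model's function calls and the MCP client. This is a minimal illustration, not the demo's exact API — the real logic lives in `openai_stackql_agent.py`, and the client class and method names below are assumptions:

```python
# Sketch of the agent's tool-dispatch step. FakeMCPClient stands in for
# the real StackQL MCP client; its method names are illustrative.
import json


class FakeMCPClient:
    """Stand-in for the StackQL MCP client used by the agent."""

    def list_providers(self):
        return ["google", "aws", "azure"]

    def query_v2(self, sql):
        return [{"name": "instance-1", "status": "RUNNING"}]


def dispatch_tool_call(client, name, arguments_json):
    """Route an OpenAI function call to the matching MCP tool method."""
    handlers = {
        "list_providers": lambda args: client.list_providers(),
        "query_v2": lambda args: client.query_v2(args["sql"]),
    }
    args = json.loads(arguments_json) if arguments_json else {}
    if name not in handlers:
        raise ValueError(f"unknown tool: {name}")
    # Serialize the result back to JSON for the model's next turn.
    return json.dumps(handlers[name](args))


result = dispatch_tool_call(FakeMCPClient(), "list_providers", "{}")
```

The model proposes a tool name and JSON arguments; the dispatcher executes the matching client method and feeds the JSON result back into the conversation.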
- Natural Language Interface: Ask questions about your cloud infrastructure in plain English
- Multi-Provider Support: Query resources across Google Cloud, AWS, Azure, GitHub, Okta, and more
- Real-time Insights: Get immediate answers about your cloud estate
- AI-Powered Analysis: Leverage GPT-4o mini for intelligent query construction and result interpretation
- Web-Based UI: Clean, intuitive Streamlit interface
- Easy Deployment: Docker Compose setup for quick starts
```
┌─────────────────┐
│   User Query    │
│ (Natural Lang)  │
└────────┬────────┘
         │
         ▼
┌──────────────────────────┐
│   Streamlit Chat UI      │
│        (app.py)          │
└───────────┬──────────────┘
            │
            ▼
┌──────────────────────────┐
│   OpenAI GPT-4o mini     │
│   (Function Calling)     │
└───────────┬──────────────┘
            │
            ▼
┌──────────────────────────┐
│   StackQL MCP Client     │
│ (stackql_mcp_client.py)  │
└───────────┬──────────────┘
            │
            ▼
┌──────────────────────────┐
│   StackQL MCP Server     │
│    (HTTP Protocol)       │
└───────────┬──────────────┘
            │
            ▼
┌──────────────────────────┐
│     Cloud Providers      │
│ (GCP, AWS, Azure, etc.)  │
└──────────────────────────┘
```
- OpenAI API Key: Required for the AI agent (Get one here)
- StackQL: Either installed locally or using Docker
- Python 3.11+: For running the chat interface
- Cloud Provider Credentials: For the providers you want to query
- Docker and Docker Compose installed
- StackQL installed (Installation guide)
- Python 3.11+ with pip
```shell
git clone https://github.com/yourusername/stackql-cloud-intel-demo.git
cd stackql-cloud-intel-demo
```

Copy the example environment file and edit it with your credentials:

```shell
cp .env.example .env
```

Edit `.env` and add your OpenAI API key:

```shell
OPENAI_API_KEY=sk-your-openai-api-key-here
OPENAI_MODEL=gpt-4o-mini
STACKQL_MCP_URL=http://127.0.0.1:9912
```

Add cloud provider credentials as needed:
```shell
# Google Cloud
GOOGLE_CREDENTIALS=/path/to/google-credentials.json

# AWS
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key

# Azure
AZURE_TENANT_ID=your_tenant_id
AZURE_CLIENT_ID=your_client_id
AZURE_CLIENT_SECRET=your_client_secret

# GitHub
GITHUB_TOKEN=your_github_token
```

Then start everything with Docker Compose:

```shell
docker-compose up -d
```

This starts both the StackQL MCP server and the chat interface.
```shell
./start.sh
```

This script will:
- Validate your configuration
- Start the StackQL MCP server
- Create a Python virtual environment
- Install dependencies
- Launch the Streamlit chat interface
- Start the StackQL MCP server:

  ```shell
  export GOOGLE_CREDENTIALS=$(cat /path/to/google-credentials.json)

  stackql mcp \
    --mcp.server.type=http \
    --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}' \
    --auth='{"google": {"type": "service_account", "credentialsfilepath": "/path/to/google-credentials.json"}}'
  ```

- In another terminal, start the chat interface:

  ```shell
  python3 -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  pip install -r requirements.txt
  streamlit run app.py
  ```

Open your browser and navigate to http://localhost:8501.
```
You: What cloud providers are available?

AI: I can help you query the following cloud providers:
- Google Cloud (google)
- Amazon Web Services (aws)
- Microsoft Azure (azure)
- GitHub (github)
- Okta (okta)
...and many more!
```

```
You: Show me all my Google Cloud compute instances

AI: Let me query your GCP instances...
[Executes: SELECT * FROM google.compute.instances WHERE project = 'your-project']

Found 5 instances in your project:
1. instance-1 (n1-standard-2, running, us-central1-a)
2. instance-2 (n1-standard-4, running, us-east1-b)
...
```

```
You: How many compute instances do I have across all providers?

AI: Let me check your compute resources across providers...

Summary:
- Google Cloud: 5 instances
- AWS: 3 instances
- Azure: 2 instances
Total: 10 compute instances
```
See `examples.md` for more query examples.
```
stackql-cloud-intel-demo/
├── app.py                    # Streamlit chat interface
├── openai_stackql_agent.py   # OpenAI integration with function calling
├── stackql_mcp_client.py     # StackQL MCP client library
├── requirements.txt          # Python dependencies
├── .env.example              # Environment variable template
├── .gitignore                # Git ignore rules
├── start.sh                  # Startup script
├── Dockerfile                # Docker image definition
├── docker-compose.yml        # Docker Compose configuration
├── examples.md               # Example queries and use cases
└── README.md                 # This file
```
The StackQL MCP server exposes the following tools to the AI agent:
| Tool | Description | Parameters |
|---|---|---|
| `greet` | Test connection | `name` (string) |
| `list_providers` | List all cloud providers | None |
| `list_services` | List services in a provider | `provider` (string) |
| `list_resources` | List resources in a service | `provider`, `service` |
| `list_methods` | List methods for a resource | `provider`, `service`, `resource` |
| `query_v2` | Execute a StackQL query | `sql` (string) |
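The tool catalog above can be mirrored in code as a small argument check before a call is forwarded to the server. This is a sketch only — the demo's client performs its own handling — but it shows the required-parameter shape of each tool:

```python
# Required parameters per MCP tool, mirroring the table above.
TOOL_PARAMS = {
    "greet": ["name"],
    "list_providers": [],
    "list_services": ["provider"],
    "list_resources": ["provider", "service"],
    "list_methods": ["provider", "service", "resource"],
    "query_v2": ["sql"],
}


def validate_tool_args(tool, args):
    """Return the list of required parameters missing from a tool call."""
    if tool not in TOOL_PARAMS:
        raise ValueError(f"unknown tool: {tool}")
    return [p for p in TOOL_PARAMS[tool] if p not in args]


# list_resources needs both provider and service, so this reports "service"
missing = validate_tool_args("list_resources", {"provider": "google"})
```

Validating arguments locally gives the AI agent a precise error to recover from before a round trip to the MCP server.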
| Variable | Description | Default | Required |
|---|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | - | Yes |
| `OPENAI_MODEL` | OpenAI model to use | `gpt-4o-mini` | No |
| `STACKQL_MCP_URL` | StackQL MCP server URL | `http://127.0.0.1:9912` | No |
| `GOOGLE_CREDENTIALS` | Path to GCP credentials | - | No |
| `AWS_ACCESS_KEY_ID` | AWS access key | - | No |
| `AWS_SECRET_ACCESS_KEY` | AWS secret key | - | No |
| `AZURE_TENANT_ID` | Azure tenant ID | - | No |
| `GITHUB_TOKEN` | GitHub personal access token | - | No |
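A minimal config loader applying the defaults from the table might look like this (a sketch; the demo reads its environment in its own modules):

```python
# Read agent settings from the environment, applying the documented
# defaults. Only OPENAI_API_KEY is mandatory.
import os


def load_config(env=None):
    """Build the agent configuration from environment variables."""
    env = env if env is not None else os.environ
    api_key = env.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "api_key": api_key,
        "model": env.get("OPENAI_MODEL", "gpt-4o-mini"),
        "mcp_url": env.get("STACKQL_MCP_URL", "http://127.0.0.1:9912"),
    }


cfg = load_config({"OPENAI_API_KEY": "sk-test"})  # defaults fill the rest
```

Failing fast on the one required variable surfaces misconfiguration at startup rather than on the first chat message.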
StackQL can be deployed in three different MCP server modes:
1. Standalone HTTP Server (used in this demo):

   ```shell
   stackql mcp --mcp.server.type=http --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}'
   ```

2. MCP + PostgreSQL (In-Memory):

   ```shell
   stackql srv --mcp.server.type=http --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}' --pgsrv.port 5665
   ```

3. MCP + PostgreSQL (Reverse Proxy):

   ```shell
   stackql srv --mcp.server.type=reverse_proxy --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9004"}, "backend": {"dsn": "postgres://stackql:[email protected]:5446"}}' --pgsrv.port 5446
   ```
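The `--mcp.config` values above are JSON strings, which are easy to get wrong under shell quoting. Generating them with `json.dumps` sidesteps that; this is a convenience sketch, not part of the demo:

```python
# Build the --mcp.config JSON for a standalone HTTP MCP server so the
# quoting is always valid JSON.
import json


def http_mcp_config(address):
    """Return the --mcp.config value for an HTTP transport at `address`."""
    return json.dumps({"server": {"transport": "http", "address": address}})


config = http_mcp_config("127.0.0.1:9912")
# Pass it as: stackql mcp --mcp.server.type=http --mcp.config "$config"
```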
Problem: Can't connect to the StackQL MCP server

Solutions:
- Verify the server is running: `curl http://localhost:9912`
- Check the logs: `tail -f stackql-mcp.log`
- Ensure port 9912 is not blocked by a firewall
- Verify `STACKQL_MCP_URL` in `.env` is correct
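A scripted equivalent of the `curl` check, using only the standard library (the URL is whatever `STACKQL_MCP_URL` points at):

```python
# Probe the MCP server address. Any HTTP response, even an error status,
# proves the port is reachable; a connection failure does not.
import urllib.error
import urllib.request


def is_reachable(url, timeout=2.0):
    """Return True if an HTTP server answers at `url`, else False."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, just not with a 2xx status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, timeout, DNS failure, ...
```

Treating HTTP error statuses as "reachable" matters here: the MCP endpoint may reject a bare GET, yet that rejection still confirms the server is up.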
Problem: Queries fail with authentication errors

Solutions:
- Verify cloud provider credentials are set correctly
- For GCP: ensure `GOOGLE_CREDENTIALS` points to a valid service account key
- For AWS: check that `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are correct
- Test authentication with the StackQL CLI: `stackql exec "SELECT * FROM google.compute.instances"`
Problem: AI responses are slow or fail

Solutions:
- Verify `OPENAI_API_KEY` is valid
- Check the OpenAI API status page: https://status.openai.com
- Review the rate limits on your OpenAI account
- Try a different model by changing `OPENAI_MODEL` in `.env`
```shell
# Test the MCP client
python stackql_mcp_client.py

# Test the OpenAI agent
python openai_stackql_agent.py
```

To extend the demo:
- New MCP Tools: Edit `stackql_mcp_client.py` to add new tool methods
- OpenAI Functions: Update the `TOOLS` array in `openai_stackql_agent.py`
- UI Enhancements: Modify `app.py` to add new UI components
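A new entry for the `TOOLS` array follows OpenAI's function-calling schema. As a hypothetical example (the demo's actual array is in `openai_stackql_agent.py`; this entry and its wording are illustrative), a `list_services` definition might look like:

```python
# Hypothetical TOOLS entry in OpenAI's function-calling format.
LIST_SERVICES_TOOL = {
    "type": "function",
    "function": {
        "name": "list_services",
        "description": "List the services available in a cloud provider",
        "parameters": {
            "type": "object",
            "properties": {
                "provider": {
                    "type": "string",
                    "description": "Provider key, e.g. 'google' or 'aws'",
                }
            },
            "required": ["provider"],
        },
    },
}

TOOLS = [LIST_SERVICES_TOOL]  # the real array holds one entry per MCP tool
```

The `description` fields do real work: they are what the model reads when deciding which tool to call and how to fill its arguments.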
- API Keys: Never commit the `.env` file or credentials to version control
- Cloud Credentials: Use least-privilege service accounts
- Network: Run the StackQL MCP server on localhost only, unless properly secured
- TLS: For production, use TLS encryption (see the StackQL MCP docs)
- Select specific columns rather than `SELECT *` for better performance
- Filter results with `WHERE` clauses when possible
- For large result sets, consider pagination
- Cache frequently accessed data when appropriate
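The first two tips can be captured in a small helper that composes column-scoped, filtered StackQL statements instead of `SELECT *`. A sketch with example column and table names; the naive string building is fine for a demo but should not be used with untrusted input:

```python
# Compose a StackQL SELECT with explicit columns and an optional filter.
# Plain string interpolation: demo-grade only, not injection-safe.
def build_query(table, columns, where=None):
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql


q = build_query(
    "google.compute.instances",
    ["name", "status", "machineType"],
    where="project = 'my-project'",
)
```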
Contributions are welcome! Please feel free to submit a Pull Request.
- StackQL Documentation
- StackQL MCP Server Guide
- Model Context Protocol Specification
- OpenAI Function Calling Guide
Apache License 2.0
- GitHub Issues: Report a bug
- StackQL Discord: Join the community
- Documentation: StackQL Docs
Made with ☁️ by the StackQL Community