Using OpenInference and OpenTelemetry to send logs and traces from your LlamaIndex RAG app to SigNoz
First, install all the necessary dependencies for the backend:
Optional: create a Python virtual environment first:

```bash
python -m venv myenv && \
source myenv/bin/activate
```

Then:

```bash
pip install -r requirements.txt
```

Install all the necessary dependencies for the frontend:
```bash
cd rag-frontend && \
npm install
```

Next, create a `.env` file in the root directory with the following:

```
OPENAI_API_KEY=<your-openai-api-key>
SIGNOZ_INGESTION_KEY=<your-signoz-ingestion-key>
```

Run the backend:
```bash
uvicorn main:app --reload
```

Wait for the docs to be fully ingested and for the application startup to complete:

Run the frontend:

```bash
cd rag-frontend && \
npm start
```

Open http://localhost:3000 in your browser to see the result and interact with the application.
After using the application, you should be able to view traces and logs in your SigNoz Cloud platform: