A Grafana plugin for monitoring and visualizing Databricks Jobs and Delta Live Tables (DLT) pipelines directly in your Grafana dashboards.
- Job Runs: Visualize your Databricks job executions, including run status, duration, and other metrics
- Pipelines: Query data about your Databricks Delta Live Tables pipelines
- Filtering Options: Flexible filtering by job ID, run type, and execution status
Please see the CHANGELOG for the latest updates and changes.
Depending on your configuration, you can either download the pre-packaged zip file or build the plugin from source (see the instructions below). The easiest way to use the plugin is to install the latest release. If your Grafana instance is configured to only allow signed plugins, you will need to sign the plugin yourself (see the instructions below).
- Download the latest release zip file from the releases page.
- Unzip the file into the `dist` directory:

  ```shell
  unzip databricks-grafana-datasource-x.y.z.zip -d dist
  ```

- Create an Access Policy Token in your Grafana Cloud instance (Security > Access Policies). This token will be used to sign the plugin.
- Export the token to your environment:

  ```shell
  export GRAFANA_ACCESS_POLICY_TOKEN=glc...
  ```

- Sign the plugin, replacing the root URL with your Grafana instance URL:

  ```shell
  npx @grafana/sign-plugin@latest --rootUrls http://localhost:3000/
  ```

- Re-package the plugin into a zip file (if needed):

  ```shell
  mv dist databricks-grafana-datasource
  zip -r databricks-grafana-datasource.zip databricks-grafana-datasource
  ```
You can now install the signed plugin in your Grafana instance. To quickly validate the plugin, you can spin up a local Grafana instance using Docker (where `$PWD/plugins` is the directory containing the signed plugin `databricks-grafana-datasource`):

```shell
docker run --rm -p 3000:3000 \
  -v "$PWD/plugins:/var/lib/grafana/plugins" \
  -e GF_AUTH_DISABLE_LOGIN_FORM=true \
  -e GF_AUTH_ANONYMOUS_ENABLED=true \
  -e GF_AUTH_ANONYMOUS_ORG_ROLE=Admin \
  grafana/grafana:latest
```

If everything is set up correctly, you should see the "Databricks Community" plugin in the Grafana UI under Configuration > Data Sources > Add data source, and a `msg="Plugin registered" pluginId=databricks-grafana-datasource` message in the Grafana server logs.
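If you prefer Docker Compose, the `docker run` command above maps to a compose file like the following (same image, port, volume, and environment variables; the file name `docker-compose.yml` is just the usual convention):

```yaml
# docker-compose.yml -- equivalent of the docker run command above.
services:
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"
    volumes:
      - ./plugins:/var/lib/grafana/plugins   # directory containing the signed plugin
    environment:
      GF_AUTH_DISABLE_LOGIN_FORM: "true"
      GF_AUTH_ANONYMOUS_ENABLED: "true"
      GF_AUTH_ANONYMOUS_ORG_ROLE: Admin
```

Run it with `docker compose up` from the directory containing the file.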
Building the plugin from source requires:

- A relatively recent LTS version of Node.js
- A relatively recent version of Go, installed and configured in your `PATH`

The plugin needs to be built from source and signed so that you can use it in your hosted Grafana instance. Follow the steps below to build and install the plugin:
- Clone the repository:

  ```shell
  git clone git@github.com:databricks/databricks-grafana-datasource.git
  ```

- Navigate to the plugin directory:

  ```shell
  cd databricks-grafana-datasource
  ```

- Build the backend (here we're running a custom mage target to skip building for Linux ARM):

  ```shell
  mage buildAllNoLinuxArm
  ```

  Your `dist` directory should now contain native binaries for various platforms.

- Build the frontend:

  ```shell
  npm install
  npm run build
  ```

  Your `dist` directory should now contain the built frontend files.
The plugin needs to be signed before it can be used in a Grafana Cloud instance. The signing process ensures that the plugin is verified and trusted by Grafana.
- Create an Access Policy Token in your Grafana Cloud instance (Security > Access Policies). This token will be used to sign the plugin.
- Export the token to your environment:

  ```shell
  export GRAFANA_ACCESS_POLICY_TOKEN=glc...
  ```

- Sign the plugin, replacing the root URL with your Grafana instance URL:

  ```shell
  npx @grafana/sign-plugin@latest --rootUrls http://localhost:3000/
  ```
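After a successful run, `@grafana/sign-plugin` writes a `MANIFEST.txt` into `dist/`, so checking for that file is a quick way to confirm the plugin was signed. The snippet below is self-contained: it creates a stand-in `dist/MANIFEST.txt` in place of the real signing output.

```shell
# Stand-in for the signing output; in a real run, dist/MANIFEST.txt
# is created by @grafana/sign-plugin, not by hand.
mkdir -p dist && touch dist/MANIFEST.txt

# The presence of MANIFEST.txt indicates the plugin was signed.
if [ -f dist/MANIFEST.txt ]; then
  echo "plugin signed"
else
  echo "MANIFEST.txt missing" >&2
fi
```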
The plugin needs to be packaged into a zip file before it can be installed in Grafana.
- Create a zip file of the plugin:

  ```shell
  mv dist databricks-grafana-datasource
  zip -r databricks-grafana-datasource.zip databricks-grafana-datasource
  ```
You can now install the plugin in your Grafana instance.
- Grafana 10 or later
- A Databricks workspace (any cloud provider)
- A Databricks Service Principal with either:
  - permissions to list all or specific jobs/pipelines, or
  - admin permissions in the workspace
- In Grafana, navigate to Configuration > Data Sources > Add data source
- Search for "Databricks Community" and select this plugin
- Configure the following settings:
  - **Workspace URL**: Your Databricks workspace URL (e.g., https://adb-xxx.0.azuredatabricks.net)
  - **Client ID**: Service Principal Client ID
  - **Client Secret**: Service Principal Client Secret
- Click "Save & Test" to verify the connection
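If you manage Grafana through provisioning, the same settings can be supplied in a data source provisioning file. A minimal sketch follows; the `jsonData`/`secureJsonData` key names (`workspaceUrl`, `clientId`, `clientSecret`) are assumptions inferred from the settings above, so verify them against the plugin's source before relying on this. The `type` value is the `pluginId` that appears in the server logs.

```yaml
# provisioning/datasources/databricks.yaml -- a sketch; field names unverified.
apiVersion: 1
datasources:
  - name: Databricks Community
    type: databricks-grafana-datasource     # pluginId from the server logs
    jsonData:
      workspaceUrl: https://adb-xxx.0.azuredatabricks.net  # assumed key name
    secureJsonData:
      clientId: $DATABRICKS_CLIENT_ID           # assumed key name
      clientSecret: $DATABRICKS_CLIENT_SECRET   # assumed key name
```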
The query editor provides different options based on the selected resource type:
For Job Runs:

- **Job ID**: Optional filter to show runs for a specific job
- **Active Only**: Toggle to show only currently running jobs
- **Completed Only**: Toggle to show only completed jobs
- **Run Type**: Filter by job run type (JOB_RUN, WORKFLOW_RUN, or SUBMIT_RUN)
- **Max Results**: Maximum number of results to return (default: 200)

For Pipelines:

- **Filter**: Text filter for pipeline queries
- **Max Results**: Maximum number of results to return (default: 200)
Please refer to the dashboards directory for example dashboards that demonstrate the capabilities of this plugin.
