Snowflake data source for Grafana
The Snowflake data source plugin allows you to query and visualize data from Snowflake within Grafana.
Requirements
This plugin has the following requirements:
- A Snowflake user with the appropriate role granted.
  - This data source does not require a specific role.
  - The Snowflake user's role is what allows that user to access tables. To query your data, ensure your user has the appropriate roles granted.
- One of the following account types:
  - A Grafana Cloud Free, Advanced, or Trial account.
  - An activated Grafana Enterprise license.
Install the Snowflake data source plugin
To install the data source, refer to Installation.
Configure Snowflake
Configuring the Snowflake data source requires a Snowflake user with a username and a password.
Grafana recommends creating a new user with limited permissions for this data source.
Create a user
To connect to Snowflake, you must create a new user or authenticate using an existing one. This user runs all queries sent from Grafana.
If you would like separate users to run different queries or workloads, create multiple Snowflake data sources with different settings.
To create a user in Snowflake, log in to your Snowflake instance and run the CREATE USER command.
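For example, a minimal sketch (the user name, password, and defaults below are placeholders, not values the plugin requires):

```sql
-- Sketch only: choose your own user name, password, and defaults.
CREATE USER grafana_user
  PASSWORD = '<strong-password>'
  DEFAULT_WAREHOUSE = my_warehouse
  DEFAULT_ROLE = grafana_role;
```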
Grant a role
Now that the Snowflake user is created, the user must be granted a role using the GRANT ROLE command. Granting a role to a user allows the user to perform the operations allowed by that role; the role defines which warehouses and tables the user has access to.
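For example, continuing the hypothetical names from the sketch above:

```sql
-- Grant the role that holds the warehouse/table privileges to the Grafana user.
GRANT ROLE grafana_role TO USER grafana_user;
```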
Configure the data source in Grafana
These connection settings are the same as those used when connecting via SnowSQL.
Add a data source by filling in the following fields:
Field | Description |
---|---|
Name | A name for this particular Snowflake data source. |
Account | The name of the Snowflake account assigned by Snowflake. In the URL received from Snowflake after the account was provisioned, the account name is the entire string to the left of `snowflakecomputing.com`. If the Snowflake instance is not on `us-west-2`, the region must be included in the account name, for example `xyz123.us-east-1`. If the Snowflake instance is not on Amazon Web Services, the platform must also be included in the account name, for example `xyz123.us-east-1.gcp`. |
Region | Deprecated in favor of Account. Specifies the region where the Snowflake instance lives. |
Username | The username of the account that will query Snowflake. |
Authentication Type | Either password authentication or key pair authentication. |
Password | The password of the account that will query Snowflake. |
Private Key | Your unencrypted private key, if you prefer to use key pair authentication. |
Role | Connect to the Snowflake instance using a role that is not the user's default. The role must still be granted to the user using the GRANT ROLE command in order to be assumed. |
Warehouse | The warehouse to use by default for queries. |
Database | The database to use by default for queries. |
Schema | The schema to use by default for queries. |
TimeInterval | Optional. The lower limit for the `$__interval` and `$__interval_ms` macros. Defaults to `10s` if left blank. |
Default Query | Optional. The default query used when adding a new Snowflake query to a panel. |
Default Variable Query | Optional. The default query used when adding a new Snowflake query to a dashboard variable. |
Key Pair authentication
For enhanced security, key pair authentication can be used as an alternative to basic authentication. You can generate the public and private keys by following the referenced Snowflake documentation. When using key pair authentication, you must update the `rsa_public_key` of the user in Snowflake and provide the username and unencrypted private key in the data source configuration.
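Registering the public key with the user might look like the following sketch; the user name is hypothetical and the key value is truncated:

```sql
-- Sketch only: paste the full public key value (without the header/footer lines).
ALTER USER grafana_user SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';
```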
OAuth authentication
You can use OAuth authentication to pass tokens through to Snowflake on behalf of the user logged in to Grafana. The following instructions describe how to use Azure AD as the OAuth provider.
Use Azure AD to set up OAuth
- Follow the instructions here to update the application you created in step 1, and add a client application for Snowflake.
- Update the scopes you created in step 1 in your `grafana.ini` file, adding the API you created in step 2. The scopes should look something like:

```ini
scopes = api://8c1a0b1c-6bb0-4190-a730-8a1c34237619/session:role-any openid email profile offline_access
```

- Restart Grafana and log in with Azure AD. Create a Snowflake data source using Authentication Type: OAuth and toggle Forward OAuth Identity. Save and test to confirm the token is being passed through and is valid.
What if I get an invalid token error? The step 2 instructions provide ways to validate the token, which will give additional information about why it is invalid:

```sql
select system$verify_external_oauth_token('<ACCESS_TOKEN>');
```
Additional links:

- Azure OAuth ("on behalf of user" flow): https://docs.snowflake.com/en/user-guide/oauth-azure
- Testing with Postman: https://community.snowflake.com/s/article/How-To-Configure-Postman-for-testing-SQL-API-with-OAuth
- Set up a Snowflake security integration: https://community.snowflake.com/s/article/Create-Security-Integration-User-To-Use-With-OAuth-Client-Token-With-Azure-AD
Configure the data source with provisioning
It is possible to configure data sources using config files with Grafana's provisioning system. You can read more about how it works, and about all the settings you can set for data sources, on the provisioning docs page.
Example:
```yaml
datasources:
  - name: Snowflake
    type: grafana-snowflake-datasource
    access: proxy
    basicAuth: false
    editable: true
    enabled: true
    jsonData:
      account: xyz123.east-us-2.azure
      username: grafana-user
      authType: password
      timeInterval: 10s
      defaultQuery: SELECT \n\t $__timeGroup(<time_column>, $__interval) as time,\n\t <value_column>\n FROM <metric_table>\n WHERE $__timeFilter(time)
      defaultVariableQuery: SELECT DISTINCT <column_name> FROM <metric_table> LIMIT 1000
      defaultInterpolation: '' # Refer: https://grafana.com/docs/grafana/latest/variables/advanced-variable-format-options/
    secureJsonData:
      password: grafana-password
  - name: Snowflake Billing Data
    type: grafana-snowflake-datasource
    access: proxy
    basicAuth: false
    editable: true
    enabled: true
    jsonData:
      account: xyz123.us-east1.gcp
      username: grafana-admin-user
      database: snowflake
      role: ACCOUNTADMIN
      timeInterval: 10s
      defaultQuery: SELECT \n\t $__timeGroup(<time_column>, $__interval) as time,\n\t <value_column>\n FROM <metric_table>\n WHERE $__timeFilter(time)
      defaultVariableQuery: SELECT DISTINCT <column_name> FROM <metric_table> LIMIT 1000
      defaultInterpolation: sqlstring # Refer: https://grafana.com/docs/grafana/latest/variables/advanced-variable-format-options/
    secureJsonData:
      password: grafana-admin-password
```
Query the data source
The provided query editor is a standard SQL query editor. Grafana includes some macros to help with writing more complex timeseries queries.
Macros
Macro | Description | Output example |
---|---|---|
`$__timeFilter(column)` | Filters the column by the panel time range. `column` must have fields without timezones. | `CONVERT_TIMEZONE('UTC', 'UTC', time) > '2017-07-18T11:15:52Z' AND CONVERT_TIMEZONE('UTC', 'UTC', time) < '2017-07-18T11:15:52Z'` |
`$__timeFilter(column, timezone)` | Filters the column by the panel time range and converts from UTC to `timezone`. `column` must have fields without timezones. | `CONVERT_TIMEZONE('UTC', 'America/New_York', time) > '2017-07-18T11:15:52Z' AND CONVERT_TIMEZONE('UTC', 'America/New_York', time) < '2017-07-18T11:15:52Z'` |
`$__timeTzFilter(column)` | Filters the column by the panel time range. `column` should have fields that include timezones. | `CONVERT_TIMEZONE('UTC', time) > '2017-07-18T11:15:52Z' AND CONVERT_TIMEZONE('UTC', time) < '2017-07-18T11:15:52Z'` |
`$__timeTzFilter(column, timezone)` | Filters the column by the panel time range and converts from the current timezone to `timezone`. `column` should have fields that include timezones. | `CONVERT_TIMEZONE('America/New_York', time) > '2017-07-18T11:15:52Z' AND CONVERT_TIMEZONE('America/New_York', time) < '2017-07-18T11:15:52Z'` |
`$__timeGroup(column, $__interval)` | Groups timestamps by the interval so that there is only one point for every `$__interval` on the graph. | `TIME_SLICE(TO_TIMESTAMP(created_ts), 1, 'HOUR', 'START')` |
`$__timeGroup(column, $__interval, timezone)` | Groups timestamps by the interval so that there is only one point for every `$__interval` on the graph, converted to the given `timezone`. | `TIME_SLICE(TO_TIMESTAMP(CONVERT_TIMEZONE('UTC', 'America/Los_Angeles', created_ts)), 1, 'HOUR', 'START')` |
Examples
Table Visualization
Most queries in Snowflake are best represented by a table visualization. Any query that returns data can be displayed in a table: if it can be queried, it can be put in a table.
This example returns results for a table visualization:
```sql
SELECT {column_1}, {column_2} FROM {table};
```
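For example, here is a concrete sketch using the `account_usage` schema queried elsewhere in this guide:

```sql
-- Sketch: show recent queries and their execution times in a table panel.
SELECT query_type, execution_time
FROM account_usage.query_history
LIMIT 100;
```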
Timeseries / Graph visualizations
For timeseries / graph visualizations, there are a few requirements:

- A column with a `date` or `datetime` type must be selected.
- The `date` column must be in ascending order (using `ORDER BY column ASC`).
- A numeric column must also be selected.

To make a more reasonable graph, be sure to use the `$__timeFilter` and `$__timeGroup` macros.
Example timeseries query:
```sql
SELECT
  avg(execution_time) AS average_execution_time,
  $__timeGroup(start_time, $__interval),
  query_type
FROM
  account_usage.query_history
WHERE
  $__timeFilter(start_time)
GROUP BY
  query_type, start_time
ORDER BY
  start_time, query_type ASC;
```
Inspecting the query
Because Grafana supports macros that Snowflake does not, the fully rendered query, which can be copy/pasted directly into Snowflake, is visible in the Query Inspector. To view the full interpolated query, click the Query Inspector button, and the full query will be visible under the “Query” tab.
Templates and variables
To add a new Snowflake query variable, refer to Add a query variable, and use your Snowflake data source as the data source for the variable query.
Any value queried from a Snowflake table can be used as a variable. Be sure to avoid selecting too many values, as this can cause performance issues.
If the variable query returns two columns, the values from the second column are used as display values.
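For example, a variable query might look like the following sketch, reusing the `account_usage` schema from earlier examples:

```sql
-- Sketch: populate a variable with the distinct query types seen in the account.
SELECT DISTINCT query_type FROM account_usage.query_history LIMIT 1000;
```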
Using variables
After creating a variable, you can use it in your Snowflake queries by using the variable syntax. For more information about variables, refer to Templates and variables.
When using variables in your query, you can optionally set the interpolation format. You can also configure the default interpolation format in the data source configuration.
Single-value variables
If a variable returns a single value, you can use one of the following formats. The example below assumes you have set `sqlstring` as the default interpolation and have two variables called `queryTypeSingle` and `limit`, with the values `SELECT` and `10` respectively.
```sql
SELECT query_type FROM account_usage.query_history WHERE query_type = ${queryTypeSingle} Limit ${limit:raw}
```
will be translated into
```sql
SELECT query_type FROM account_usage.query_history WHERE query_type = 'SELECT' Limit 10
```
Setting a default variable interpolation type was introduced in version 1.2 of the plugin. From this version, you can set the default interpolation type to `sqlstring`. Prior to version 1.2, or if you use `none` as the default interpolation type, the same query has to be written as follows:
```sql
SELECT query_type FROM account_usage.query_history WHERE query_type = '${queryTypeSingle}' Limit ${limit:raw}
```
Multi-value variables using sqlstring interpolation
When consuming a variable that returns multiple values, you can use it as follows. The example below assumes you have set `sqlstring` as the default interpolation and have variables called `queryTypeMulti` and `limit`, with the values `CREATE,SELECT` and `10` respectively.
```sql
SELECT query_type FROM account_usage.query_history WHERE query_type in (${queryTypeMulti}) Limit ${limit:raw}
```
will be translated into
```sql
SELECT query_type FROM account_usage.query_history WHERE query_type in ('CREATE','SELECT') Limit 10
```
Multi-value variables using regex
To use a variable that has multiple values, use the regex modifier option (for example, `${variable:regex}`) together with the Snowflake `regexp` function. For example, this query filters using only the query types selected in the `queryType` variable:
```sql
...
AND query_type regexp '${queryType:regex}'
...
```
This query translates to:
```sql
...
AND query_type regexp '(DESCRIBE|CREATE_USER|DROP|TRUNCATE_TABLE|ALTER)'
...
```
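A fuller sketch of such a query, reusing the `account_usage.query_history` table from earlier examples, might look like this:

```sql
-- Sketch: filter query history to the query types selected in the variable.
SELECT
  $__timeGroup(start_time, $__interval),
  query_type,
  avg(execution_time) AS average_execution_time
FROM account_usage.query_history
WHERE $__timeFilter(start_time)
  AND query_type regexp '${queryType:regex}'
GROUP BY query_type, start_time;
```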
Visualizing data as Logs
With the Logs format selected in the query, you can visualize the data in the Logs viewer in Explore. When querying with the Logs format, your query should have at least one time column and one string/content column. Optionally, the query can have a third column called `level` to set the log level of each row; the supported log levels and their keywords can be found in the Grafana logs integration documentation. If the query returns any additional columns, they are treated as detected fields in the logs.
For example, the following is a valid log query:
```sql
SELECT 'hello foo' as "content", (timestamp '2021-12-31') as "start_time", 'warn' as "level" UNION
SELECT 'hello bar' as "content", (timestamp '2021-12-30 14:12:59') as "start_time", 'error' as "level" UNION
SELECT 'hello baz' as "content", (timestamp '2021-12-30') as "start_time", 'warn' as "level" UNION
SELECT 'hello qux' as "content", (timestamp '2021-12-29') as "start_time", 'info' as "level" UNION
SELECT 'hello world' as "content", (timestamp '2021-12-28') as "start_time", 'unknown' as "level" UNION
SELECT 'hello user' as "content", (timestamp '2021-12-27') as "start_time", 'info' as "level"
```
Import a dashboard for Snowflake
The bundled Snowflake dashboard requires a data source with the `ACCOUNTADMIN` role for viewing billing data. There is an example of this in the provisioning section.
Follow these instructions for importing a dashboard.
Imported dashboards can be found in Configuration > Data Sources > select your Snowflake data source > Dashboards tab, which lists the available pre-made dashboards.
Get the most out of the plugin
- Add Annotations.
- Configure and use Templates and variables.
- Add Transformations.
- Use Logs in Explore.
- Set up alerting; refer to Alerts overview.