LLM plugin
Note
The LLM app plugin is currently in public preview. Grafana Labs offers limited support, and breaking changes might occur before the feature becomes generally available.
Grafana Cloud offers a range of optional features that leverage Large Language Model (LLM) services. These features are not enabled by default, but you can easily activate them in the Grafana LLM app plugin by approving limited data sharing with the OpenAI API.
The Grafana LLM app centralizes access to Large Language Model (LLM) services across Grafana to secure and simplify your LLM interactions.
The Grafana LLM application plugin serves several key functions:
- Acts as a proxy, handling authenticated requests to LLMs. This eliminates the requirement for other Grafana components to manage API keys.
- Enables real-time streaming interactions on the Grafana front end by offering live streams of responses from the LLM provider.
If you prefer, you can instead configure authentication with your own API keys from supported LLM providers, including OpenAI and Azure. With this option, the LLM app stores the API keys securely for you.
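Because the plugin proxies authenticated requests, client code talks to Grafana rather than to the LLM provider directly, and the stored provider key is attached server-side. The following is a minimal sketch of that pattern; the stack URL, service-account token, resource path, and model name are all assumptions for illustration, so check the plugin documentation for the exact endpoints.

```python
# Hypothetical sketch: routing a chat-completion request through the
# Grafana LLM app's authenticated proxy. The endpoint path and payload
# shape below are assumptions, not the plugin's documented API.
import json
import urllib.request

GRAFANA_URL = "https://my-stack.grafana.net"  # assumption: your Grafana Cloud stack URL
API_TOKEN = "<service-account-token>"         # assumption: a Grafana service account token


def build_proxy_request(prompt: str) -> urllib.request.Request:
    """Build a request aimed at the LLM app plugin's proxy resource.

    Grafana forwards the call to the configured provider and attaches the
    stored provider API key server-side, so no OpenAI/Azure key appears here.
    """
    url = (
        f"{GRAFANA_URL}/api/plugins/grafana-llm-app"
        "/resources/openai/v1/chat/completions"  # assumed resource path
    )
    body = json.dumps({
        "model": "base",  # assumption: an abstract model name mapped by the plugin
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_proxy_request("Summarize the last hour of error logs.")
```

Only the Grafana service-account token is needed in the client; the provider credentials stay inside the plugin's secure storage.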
What can it do?
Unlock the potential of the Grafana LLM plugin with features like:
- AI-powered flamegraph interpretation
- Incident auto-summary
- Dashboard panel title and description generation
- Explanations of error log lines in Sift