DevOps AI Toolkit Headlamp Plugin
AI-powered Kubernetes operations directly inside Headlamp — query, remediate, operate, and recommend with natural language.
What is the Headlamp Plugin?
The DevOps AI Toolkit Headlamp Plugin brings AI-powered Kubernetes operations into the Headlamp dashboard. Instead of switching between separate UIs, you get Query, Remediate, Operate, Recommend, and Knowledge Base search directly inside your existing Kubernetes dashboard.
The plugin communicates with the dot-ai MCP server running in your cluster via Headlamp's Kubernetes API proxy. Authentication is handled by Kubernetes RBAC using your existing cluster credentials.
Features
Query
Ask natural language questions about your cluster. Results render as interactive diagrams (Mermaid.js), data tables, cards, and syntax-highlighted code blocks.
Remediate
Diagnose issues on any Kubernetes resource. Get root cause analysis and remediation options. Available as a standalone page and injected into every resource detail page via Headlamp's registerDetailsViewSection().
Operate
Perform Day 2 operations (scale, update, rollback, delete) via natural language. The plugin shows proposed changes with dry-run validation before execution. Available as a standalone page and on resource detail pages.
Recommend
Multi-step wizard for deployment recommendations. Describe what you want to deploy, select from scored solutions, answer configuration questions, preview generated YAML manifests, and deploy directly to your cluster.
Knowledge Base Search
Search your organization's knowledge base from the Headlamp app bar. Get AI-synthesized answers with markdown rendering, source references, and collapsible content chunks.
Quick Start
Recommended: For the easiest setup, install the complete dot-ai stack, which includes all components (MCP server, Web UI, and Controller). See the Stack Installation Guide.
Prerequisites
- Headlamp >= 0.22 (desktop, in-cluster, web, or Docker Desktop)
- dot-ai MCP server running in your cluster
Install from Plugin Catalog
- Open Headlamp
- Go to Settings > Plugin Catalog
- Search for dot-ai
- Click Install
The plugin is also available on ArtifactHub.
Configure
After installing, go to Settings > Plugins > dot-ai and configure:
| Setting | Default | Description |
|---|---|---|
| Service name | dot-ai | Kubernetes Service name for the dot-ai MCP server |
| Namespace | dot-ai | Namespace where the Service lives |
| Port | 3456 | Service port |
| Token | (empty) | Optional bearer token sent in the X-Dot-AI-Authorization header |
No token needed for most setups — authentication is handled by Kubernetes RBAC through Headlamp's API proxy.
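The settings above map directly onto the Kubernetes service-proxy path the plugin requests. As a rough sketch (the settings shape and function name are illustrative, not the plugin's actual code; only the path format is the standard Kubernetes service proxy), the defaults resolve like this:

```typescript
// Illustrative sketch: how plugin settings map to the Kubernetes
// service-proxy path. Types and names here are hypothetical.

interface DotAiSettings {
  serviceName?: string;
  namespace?: string;
  port?: number;
}

// Defaults from the settings table above.
const DEFAULTS = { serviceName: 'dot-ai', namespace: 'dot-ai', port: 3456 };

function serviceProxyPath(settings: DotAiSettings = {}): string {
  const { serviceName, namespace, port } = { ...DEFAULTS, ...settings };
  // Standard Kubernetes service-proxy path format.
  return `/api/v1/namespaces/${namespace}/services/${serviceName}:${port}/proxy/`;
}

console.log(serviceProxyPath());
// With the defaults: /api/v1/namespaces/dot-ai/services/dot-ai:3456/proxy/
```

Changing any setting simply changes the corresponding segment of that path.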
How It Works
Headlamp Browser
└── dot-ai plugin
└── ApiProxy.request()
└── Kubernetes API Server (RBAC)
└── dot-ai Service (in-cluster)
All requests flow through Headlamp's ApiProxy.request(), which proxies to the Kubernetes API server. The API server then forwards to the dot-ai Service using the standard Kubernetes service proxy (/api/v1/namespaces/{ns}/services/{name}:{port}/proxy/). Your existing cluster credentials handle authentication — no separate token management needed.
The plugin uses a custom X-Dot-AI-Authorization header for optional bearer token auth, since Headlamp's backend overwrites the standard Authorization header with the Kubernetes token.
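A minimal sketch of why the custom header exists. Note this is illustrative only: the real plugin goes through Headlamp's ApiProxy.request() rather than raw fetch, and the endpoint path and `queryDotAi` helper below are hypothetical; only the X-Dot-AI-Authorization header name comes from the description above.

```typescript
// Illustrative only: the standard Authorization header is reserved for the
// Kubernetes token (Headlamp's backend overwrites it), so an optional dot-ai
// bearer token travels in a custom header instead.

function dotAiHeaders(token?: string): Record<string, string> {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  if (token) {
    // Custom header; plain Authorization would be replaced by Headlamp.
    headers['X-Dot-AI-Authorization'] = `Bearer ${token}`;
  }
  return headers;
}

async function queryDotAi(question: string, token?: string): Promise<unknown> {
  // Hypothetical route under the service-proxy path; the real endpoints are
  // defined by the dot-ai MCP server and not shown here.
  const path = '/api/v1/namespaces/dot-ai/services/dot-ai:3456/proxy/';
  const res = await fetch(path, {
    method: 'POST',
    headers: dotAiHeaders(token),
    body: JSON.stringify({ question }),
  });
  return res.json();
}
```

With no token configured, the custom header is simply omitted and Kubernetes RBAC alone gates the request.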
Compatibility
The plugin works with all Headlamp distributions:
| Distribution | Support |
|---|---|
| In-cluster | Yes |
| Web | Yes |
| Docker Desktop | Yes |
| Desktop app | Yes |
Support
- GitHub Issues: Bug reports and feature requests
Related Projects
- AI Engine — DevOps AI Toolkit MCP server
- Web UI — Standalone web UI (alternative frontend)
- CLI — CLI for terminal-based interaction
- Controller — Kubernetes controller for autonomous operations