AI Chat requires a connected AI model. If you haven’t configured one yet, see AI Setup.
Opening the Chat
Click the chat icon in the platform header to open the AI Chat sidebar. The chat is available at both the Cluster and Workspace level — the agent automatically scopes its responses to the context you’re working in.

What You Can Do
The AI agent understands natural language and can use its tools to interact with your cluster on your behalf. Common use cases include:
- Investigating issues such as misconfigured resources, unreachable applications, or networking problems.
- Querying cluster resources like Pods, Deployments, Services, and ConfigMaps.
- Getting guidance on resource optimization, including node allocation and resource limits.
- Reviewing and writing Kubernetes manifests or Helm values.
- Troubleshooting errors interactively by providing additional context the agent can use.
Tools
The AI Chat uses the same tools configured in AI Setup. Depending on your configuration, the agent can access the following during a conversation:

Built-in Tools
These are always available:
- Kubernetes — The agent can query resources, read logs, describe Pods, inspect events, and retrieve manifests from your cluster. When you ask about a failing Deployment or a misconfigured Service, the agent fetches the relevant data in real time.
- Helm — The agent can list releases, inspect chart values, and check release history. This is useful for questions about what’s currently deployed or for reviewing a Helm upgrade.
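For a sense of what these tool calls correspond to, here is a rough sketch of the standard `kubectl` and `helm` commands that cover the same ground. This is illustrative only — the names `my-app` and `default` are placeholder examples, not values from your cluster, and the agent's actual tool calls may differ.

```shell
# Query resources and retrieve a manifest (placeholder names: my-app, default)
kubectl get deployments -n default
kubectl get deployment my-app -n default -o yaml

# Describe a Pod, read logs, and inspect recent events
kubectl describe pod my-app-6d4f9 -n default
kubectl logs deploy/my-app -n default --tail=50
kubectl get events -n default --sort-by=.lastTimestamp

# List Helm releases, inspect chart values, and check release history
helm list -n default
helm get values my-app -n default
helm history my-app -n default
```

Asking the agent about a failing Deployment is roughly equivalent to running a combination of the commands above and interpreting the output for you.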
GitHub
When GitHub is connected in AI Setup, the agent can access your linked repositories during chat. This enables:
- Looking up source code, configurations, and CI/CD pipelines related to the resources you’re asking about.
- Cross-referencing cluster state with the code that produced it.