Opsmate
Opsmate is an LLM-powered SRE copilot for understanding and solving production problems. By encoding expert troubleshooting patterns and operational knowledge, Opsmate lets users describe problems and intentions in natural language, eliminating the need to memorise complex command-line or domain-specific tool syntax.
Opsmate not only solves problems autonomously, but also lets human operators provide feedback and take over control when needed. It accelerates incident response, reduces mean time to repair (MTTR), and frees teams to focus on solving problems rather than wrestling with tooling.
Table of Contents
- Features
- Installation
- Configuration
- Quick Start
- Advanced Usage
- Use Cases
- Integrations
- Documentation
- Contributing
- License
Features
- 🤖 Natural Language Interface: Run commands using natural language without remembering complex syntax
- 🔍 Advanced Reasoning: Troubleshoot and solve production issues with AI-powered reasoning
- 🔄 Multiple LLM Support: Works out of the box with OpenAI, Anthropic, and xAI, and is easy to extend to other LLMs.
- 🛠️ Multiple Runtimes: Supports execution environments including local, Docker, Kubernetes, and remote VMs.
- 🔭 Modern Observability Tooling: Built-in Prometheus support lets you create time-series dashboards with natural language, with more integrations to come.
- 🧠 Knowledge Management: Ingest and use domain-specific knowledge
- 📈 Web UI & API: Access Opsmate through a web interface or API
- 🔌 Plugin System: Extend Opsmate with custom plugins
Installation
Choose your preferred installation method:
The recommended way to install opsmate is with uv:
# Using uv
uv tool install -U opsmate
Alternatively, you can install opsmate with pip, pipx, or Docker.
# Using pip
pip install -U opsmate
# Using pipx
pipx install opsmate
# or
pipx upgrade opsmate
# Using Docker
docker pull ghcr.io/opsmate-ai/opsmate:latest
alias opsmate="docker run -it --rm --env OPENAI_API_KEY=$OPENAI_API_KEY -v $HOME/.opsmate:/root/.opsmate ghcr.io/opsmate-ai/opsmate:latest"
# From source
git clone git@github.com:opsmate-ai/opsmate.git
cd opsmate
uv build
pipx install ./dist/opsmate-*.whl
Configuration
Opsmate is powered by large language models and currently supports OpenAI, Anthropic, and xAI out of the box.
Set up your API key in an environment variable:
export OPENAI_API_KEY="sk-proj..."
# or
export ANTHROPIC_API_KEY="sk-ant-api03-..."
# or
export XAI_API_KEY="xai-..."
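Before launching Opsmate, it can be useful to confirm that at least one of the provider keys above is actually exported. The helper below is a small illustrative sketch, not part of Opsmate itself; the variable names are the ones listed above.

```shell
# Return success if at least one supported LLM provider key is set.
have_llm_key() {
  [ -n "${OPENAI_API_KEY:-}" ] || [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${XAI_API_KEY:-}" ]
}

have_llm_key || echo "No LLM API key set; export one of the variables above." >&2
```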
Quick Start
Run commands with natural language
$ opsmate run "what's the gpu of the vm"
# Output: Command and result showing GPU information
Solve complex production issues
$ opsmate solve "what's the k8s distro of the current context"
# Output: Thought process and analysis determining K8s distribution
Interactive chat mode
$ opsmate chat
Web UI and API
$ opsmate serve
# Web interface: http://localhost:8080
# API documentation: http://localhost:8080/api/docs
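When scripting against the server, a simple reachability probe helps distinguish "server not running" from application errors. This is a hypothetical helper (not part of Opsmate), assuming the default port shown above:

```shell
# Return success if the given URL responds; curl's -f turns
# HTTP error responses into a non-zero exit status.
api_up() {
  curl -fsS -o /dev/null "$1"
}

# Example (assumes `opsmate serve` is running locally):
# api_up http://localhost:8080/api/docs && echo "API is serving"
```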
Advanced Usage
Opsmate can be deployed to production Kubernetes clusters using the opsmate-operator, which provides:
- Task scheduling via CRDs
- Dedicated HTTPS endpoints and web UI for tasks
- Multi-tenancy support
- Automatic resource management with TTL
- API server for environment management
Check our production documentation for details.
Use Cases
Opsmate supports various use cases:
- Production issue troubleshooting and resolution
- Root cause analysis
- Performance analysis and improvement
- Observability and monitoring setup
- Capacity planning
- On-call engineer assistance
- Infrastructure as Code management
- Routine task automation (CI/CD, backups, updates)
- Knowledge management
- Workflow orchestration
Integrations
For a comprehensive list of integrations, please refer to the integrations and cookbooks sections.
Documentation
For comprehensive documentation, visit the Opsmate documentation site.
Contributing
Contributions are welcome! See our development guide for details.
License
This project is licensed under the MIT License - see the LICENSE file for details.
File details
Details for the file opsmate-0.2.0a0.tar.gz.
File metadata
- Download URL: opsmate-0.2.0a0.tar.gz
- Upload date:
- Size: 2.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | e2c62d78ef43853e2c60d5116776f4d5ebe204e5ae40c65ff68a691307c1475a
MD5 | f3f3f2200ac94c0aa1b86a60b112002f
BLAKE2b-256 | f30d90804e204bddd729dc038dfe8a9c1417cd79c956a9df7460a1f3879fdc85
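After downloading, you can check the archive against the published digest. A minimal sketch (the helper name is ours, not a PyPI or Opsmate tool), assuming `sha256sum` from GNU coreutils is available:

```shell
# Compare a file's SHA-256 digest against an expected value.
verify_sha256() {
  actual="$(sha256sum "$1" | awk '{print $1}')"
  [ "$actual" = "$2" ]
}

# Example, using the SHA256 digest from the table above:
# verify_sha256 opsmate-0.2.0a0.tar.gz \
#   e2c62d78ef43853e2c60d5116776f4d5ebe204e5ae40c65ff68a691307c1475a
```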
Provenance
The following attestation bundles were made for opsmate-0.2.0a0.tar.gz:
Publisher: publish.yml on opsmate-ai/opsmate
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: opsmate-0.2.0a0.tar.gz
- Subject digest: e2c62d78ef43853e2c60d5116776f4d5ebe204e5ae40c65ff68a691307c1475a
- Sigstore transparency entry: 202180109
- Sigstore integration time:
- Permalink: opsmate-ai/opsmate@6d693cb74e4bff8a3c3e1c3c374f893ff652235d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/opsmate-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6d693cb74e4bff8a3c3e1c3c374f893ff652235d
- Trigger Event: push
File details
Details for the file opsmate-0.2.0a0-py3-none-any.whl.
File metadata
- Download URL: opsmate-0.2.0a0-py3-none-any.whl
- Upload date:
- Size: 191.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 031eeff2ce7982ad3d5ed9459a7cfac5a2c7b7955ab923ad2dc0cc8c9cd01f5f
MD5 | caf4835691dc8dbeccb0c018498406de
BLAKE2b-256 | e0b2edce69ce589522a5fa1029b4904fcc92b207fcde55c3adb54d991464d006
Provenance
The following attestation bundles were made for opsmate-0.2.0a0-py3-none-any.whl:
Publisher: publish.yml on opsmate-ai/opsmate
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: opsmate-0.2.0a0-py3-none-any.whl
- Subject digest: 031eeff2ce7982ad3d5ed9459a7cfac5a2c7b7955ab923ad2dc0cc8c9cd01f5f
- Sigstore transparency entry: 202180111
- Sigstore integration time:
- Permalink: opsmate-ai/opsmate@6d693cb74e4bff8a3c3e1c3c374f893ff652235d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/opsmate-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6d693cb74e4bff8a3c3e1c3c374f893ff652235d
- Trigger Event: push