Time Machine for AI Agents: Cognitive Version Control for LLM context
🧠 Cognitive Version Control
Git for the AI Mind
Save. Branch. Rewind. Merge. Your AI agent just got an undo button.
pip install tm-ai
🤖 Agent CLI · ✨ Features · 🚀 Quick Start · 📚 CLI Reference · 🤝 Contributing
Your AI coding agent is brilliant... for about 20 minutes.
Then it forgets what it already fixed, contradicts its own plan, and loops on the same error for eternity.
Sound familiar?
🧠 What Is This?
CVC gives AI coding agents something they've never had: memory management that actually works.
Git, but for the AI's brain. Instead of versioning source code, CVC versions the agent's entire context (every thought, every decision, every conversation turn) as an immutable, cryptographic Merkle DAG.
The agent can checkpoint its reasoning, branch into risky experiments, rewind when stuck, and merge only the insights that matter.
| 💾 Save | 🌿 Branch | 🔀 Merge | ⏪ Rewind |
|---|---|---|---|
| Checkpoint the agent's brain at any stable moment. | Explore risky ideas in isolation. Main context stays clean. | Merge learnings back, not raw logs. Semantic, not syntactic. | Stuck in a loop? Time-travel back instantly. |
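The commit model above can be made concrete with a minimal content-addressed sketch. This is only an illustration of the Merkle-DAG idea, not CVC's actual schema; all names are hypothetical:

```python
import hashlib
import json

def commit(parent_hash, context):
    """Hash the parent pointer together with the context snapshot,
    producing an immutable, content-addressed commit id."""
    payload = json.dumps({"parent": parent_hash, "context": context}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

root = commit(None, ["user: fix the login bug"])
a = commit(root, ["user: fix the login bug", "assistant: patched auth.py"])
b = commit(root, ["user: fix the login bug", "assistant: try a risky refactor"])  # a branch

# Identical history always hashes to the same id; divergent branches differ.
assert a != b
assert commit(root, ["user: fix the login bug", "assistant: patched auth.py"]) == a
```

Because each commit id hashes its parent's id, restoring or branching from any checkpoint is just a pointer move; the history itself is immutable.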
🤖 CVC Agent: Your Own AI Coding Assistant
Claude Code on steroids, with Time Machine built in.
Just type cvc and you're in. No setup menus, no extra commands.
cvc
That's it. One command. The agent launches.
CVC ships with a full agentic coding assistant directly in your terminal: like Claude Code, but with the ability to save, branch, rewind, and search through your entire conversation history. It's not just an AI chat; it's an AI with cognitive version control.
🔧 15 Built-in Tools
The agent has access to powerful tools that let it work directly on your codebase:
| Tool | What It Does |
|---|---|
| `read_file` | Read files, with optional line ranges for large files |
| `write_file` | Create or overwrite files; auto-creates directories |
| `edit_file` | Precise find-and-replace edits with uniqueness validation |
| `bash` | Run shell commands (PowerShell on Windows, bash on Unix) |
| `glob` | Find files by pattern (`**/*.py`, `src/**/*.ts`) |
| `grep` | Search file contents with regex + include filters |
| `list_dir` | List directory contents to explore project structure |
| `cvc_status` | Show current branch, HEAD, and context state |
| `cvc_log` | View commit history: snapshots of the conversation |
| `cvc_commit` | Save a checkpoint of the current conversation state |
| `cvc_branch` | Create a branch to explore alternatives safely |
| `cvc_restore` | Time-travel back to any previous conversation state |
| `cvc_merge` | Merge insights from one branch into another |
| `cvc_search` | Search commit history for specific topics or discussions |
| `cvc_diff` | Compare conversation states between commits |
⌨️ Slash Commands
While chatting with the agent, use these slash commands for quick actions:
| Command | Description |
|---|---|
| `/help` | Show all available slash commands |
| `/status` | View branch, HEAD, context size, provider & model |
| `/log` | Show the last 20 conversation checkpoints |
| `/commit <message>` | Save a manual checkpoint of the conversation |
| `/branch <name>` | Create and switch to a new conversation branch |
| `/restore <hash>` | Time-travel back to a specific checkpoint |
| `/search <query>` | Search all commits for a topic (e.g., `/search auth login`) |
| `/compact` | Compress the conversation history, keeping recent context |
| `/clear` | Clear conversation history (CVC state preserved) |
| `/model <name>` | Switch LLM model mid-conversation |
| `/exit` | Save final checkpoint and exit cleanly |
🧠 What Makes It Different

| Claude Code / Codex | Aider / Cursor | 🔥 CVC Agent |
|---|---|---|
🎨 Agent Options
cvc # Launch agent with saved config
cvc agent # Same thing โ explicit subcommand
cvc agent --provider anthropic # Force a specific provider
cvc agent --model claude-sonnet-4-5 # Override the model
cvc agent --api-key sk-ant-... # Pass API key directly
🔄 Auto-Commit
The agent automatically saves checkpoints every 5 assistant turns (CVC_AGENT_AUTO_COMMIT=5).
When you exit with /exit, a final checkpoint is saved. You never lose context.
🔥 The Problem We're Solving
The industry keeps making context windows bigger: 4K → 32K → 128K → 1M+ tokens.
It's not progress.
Research shows that after ~60% context utilisation, LLM reasoning quality falls off a cliff. One hallucination poisons everything that follows. Error cascades compound. The agent starts fighting itself.
A bigger window doesn't fix context rot. It just gives it more room to spread.
The Real Issue
AI agents have zero ability to manage their own cognitive state. They can't save their work. They can't explore safely. They can't undo mistakes. They're solving a 500-piece puzzle while someone keeps removing pieces from the table.
📊 What the Research Shows

| Result | What | Source |
|---|---|---|
| 58.1% | Context reduction via branching | ContextBranch paper |
| 3.5× | Success rate improvement with rollback | GCC paper |
| ~90% | Cost reduction through caching | Prompt caching |
| ~85% | Latency reduction | Cached tokens skip processing |
⚙️ How It Works
CVC operates in two modes: as a standalone agent (just type cvc) or as a proxy between your favourite AI tool and the LLM provider.
%%{init: {'theme': 'dark', 'themeVariables': { 'fontSize': '14px', 'primaryColor': '#2C0000', 'primaryTextColor': '#E8D0D0', 'primaryBorderColor': '#8B0000', 'lineColor': '#CC3333', 'secondaryColor': '#5C1010', 'tertiaryColor': '#3D0000', 'edgeLabelBackground': '#2C0000'}}}%%
flowchart LR
subgraph LOCAL["YOUR MACHINE"]
direction TB
subgraph AGENT_MODE[" CVC Agent (cvc) "]
AG["Terminal Agent\n15 tools · 4 providers"]
end
subgraph PROXY_MODE[" CVC Proxy · localhost:8000 "]
direction TB
R["LangGraph Router"]
R -->|CVC ops| E["Cognitive Engine"]
R -->|passthrough| FWD["Forward to LLM"]
end
subgraph STORAGE[" .cvc/ directory "]
direction LR
S1["SQLite\nCommit Graph"]
S2["CAS Blobs\nZstandard"]
S3["Chroma\nVectors"]
end
AG -- "direct API" --> CLOUD
AG --> E
R -- "HTTP" --> CLOUD
E --> S1 & S2 & S3
end
subgraph CLOUD[" LLM Provider "]
direction TB
C1["Claude"]
C2["GPT-5.2"]
C3["Gemini 3 Pro"]
C4["Ollama (local)"]
end
style LOCAL fill:#1a0505,stroke:#8B0000,stroke-width:2px,color:#E8D0D0
style AGENT_MODE fill:#2C0000,stroke:#CC3333,stroke-width:2px,color:#E8D0D0
style PROXY_MODE fill:#2C0000,stroke:#8B0000,stroke-width:1px,color:#E8D0D0
style STORAGE fill:#1a0505,stroke:#5C1010,stroke-width:1px,color:#E8D0D0
style CLOUD fill:#1a0505,stroke:#CC3333,stroke-width:2px,color:#E8D0D0
style AG fill:#8B0000,stroke:#CC3333,color:#ffffff
style R fill:#5C1010,stroke:#CC3333,color:#ffffff
style E fill:#3D5C10,stroke:#55AA55,color:#ffffff
style FWD fill:#5C1010,stroke:#CC3333,color:#ffffff
style S1 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
style S2 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
style S3 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
style C1 fill:#8B0000,stroke:#CC3333,color:#ffffff
style C2 fill:#5C1010,stroke:#CC3333,color:#ffffff
style C3 fill:#3D5C10,stroke:#55AA55,color:#ffffff
style C4 fill:#444444,stroke:#888888,color:#ffffff
🎯 Three-Tiered Storage (All Local)

| Tier | What | Why |
|---|---|---|
| SQLite | Commit graph, branch pointers, metadata | Fast traversal, zero-config, works everywhere |
| CAS Blobs | Compressed context snapshots (Zstandard) | Content-addressable, deduplicated, efficient |
| Chroma | Semantic embeddings (optional) | "Have I solved this before?" Search by meaning |

✨ Everything stays in `.cvc/` inside your project
🔒 No cloud • No telemetry • Your agent's thoughts are yours
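The CAS tier above boils down to "address each blob by the hash of its contents," which is what makes identical snapshots free to store twice. A toy sketch (a dict stands in for the blob store and zlib for Zstandard; this is not CVC's actual code):

```python
import hashlib
import zlib

store = {}  # blob hash -> compressed bytes (SQLite-backed blobs in the real tool)

def put(snapshot: bytes) -> str:
    """Store a snapshot under its content address, deduplicating on the way in."""
    key = hashlib.sha256(snapshot).hexdigest()  # the content address
    if key not in store:                        # identical snapshots are stored once
        store[key] = zlib.compress(snapshot)    # zlib stands in for Zstandard here
    return key

k1 = put(b"conversation turn 1")
k2 = put(b"conversation turn 1")  # duplicate content, same address
assert k1 == k2 and len(store) == 1
```

Deduplication falls out of the addressing scheme itself: the second `put` computes the same hash and finds the blob already present.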
🚀 Get Started
Prerequisites
Python 3.11+ • Git (for VCS bridge features)
📦 Install
Available on PyPI: install in one command, no cloning required.
pip install tm-ai
That's it. The cvc command is now available globally.
🔧 More install options
# With uv (faster)
uv pip install tm-ai
# As an isolated uv tool (always on PATH, no venv needed)
uv tool install tm-ai
# With provider extras
pip install "tm-ai[anthropic]" # Anthropic (Claude)
pip install "tm-ai[openai]" # OpenAI (GPT)
pip install "tm-ai[google]" # Google (Gemini)
pip install "tm-ai[all]" # Everything
🛠️ For contributors / local development
git clone https://github.com/mannuking/AI-Cognitive-Version-Control.git
cd AI-Cognitive-Version-Control
uv sync --extra dev # or: pip install -e ".[dev]"
▶️ Run
The simplest way โ just type cvc:
cvc
This launches the CVC Agent directly. If it's your first time, you'll be guided through setup first (pick your provider, model, and API key).
Or use specific commands:
# Launch the agent explicitly
cvc agent
cvc agent --provider openai --model gpt-5.2
# Launch external AI tools through CVC's proxy
cvc launch claude # Claude Code CLI
cvc launch aider # Aider
cvc launch codex # OpenAI Codex CLI
cvc launch cursor # Cursor IDE
cvc launch code # VS Code
# One-command start (setup + init + serve proxy)
cvc up
Cross-platform: Works on Windows, macOS, and Linux. Global config is stored in the platform-appropriate location:
- Windows: `%LOCALAPPDATA%\cvc\config.json`
- macOS: `~/Library/Application Support/cvc/config.json`
- Linux: `~/.config/cvc/config.json`
🔑 Set Your API Key

| Provider | Bash / Linux / macOS | PowerShell |
|---|---|---|
| Anthropic | `export ANTHROPIC_API_KEY="sk-ant-..."` | `$env:ANTHROPIC_API_KEY = "sk-ant-..."` |
| OpenAI | `export OPENAI_API_KEY="sk-..."` | `$env:OPENAI_API_KEY = "sk-..."` |
| Google | `export GOOGLE_API_KEY="AIza..."` | `$env:GOOGLE_API_KEY = "AIza..."` |
| Ollama | No key needed: just run `ollama serve` and `ollama pull qwen2.5-coder:7b` | |

Or save your keys via `cvc setup`; they're stored securely on your machine.
🔌 Connect External AI Tools (Proxy Mode)
If you prefer to use your own AI tool instead of the built-in agent, CVC runs as a transparent proxy that time-machines every conversation:
API-Based Tools (Proxy Mode)
Point your AI agent's API base URL to http://127.0.0.1:8000
CVC exposes OpenAI-compatible (/v1/chat/completions) AND Anthropic-native (/v1/messages) endpoints.
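Since the proxy speaks the standard OpenAI chat schema, any HTTP client works unchanged. A minimal stdlib sketch (the model name and prompt are illustrative):

```python
import json
from urllib import request

# A standard OpenAI-style chat request; CVC's proxy accepts the same shape
# on /v1/chat/completions, so OpenAI-compatible clients need no changes.
payload = {
    "model": "gpt-5.2",
    "messages": [{"role": "user", "content": "What did we decide about auth?"}],
}
req = request.Request(
    "http://127.0.0.1:8000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With the proxy running (`cvc serve`), send it with:
# resp = request.urlopen(req)
```

The same request sent straight to a provider would work identically; pointing the base URL at the proxy is the only change a client needs.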
Auth-Based Tools (MCP Mode)
For IDEs that use login authentication (Antigravity, Windsurf, native Copilot), CVC runs as an MCP server:
cvc mcp # Start MCP server (stdio transport)
cvc mcp --transport sse # Start MCP server (HTTP/SSE transport)
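Most MCP clients register a stdio server with a command-plus-args entry; a typical entry for CVC would look roughly like this (the exact file name and location vary by IDE, so treat this as an assumed shape rather than a verbatim config):

```json
{
  "mcpServers": {
    "cvc": {
      "command": "cvc",
      "args": ["mcp"]
    }
  }
}
```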
| Tool | Auth Type | How to Connect |
|---|---|---|
| VS Code + Copilot | GitHub Login | BYOK: Ctrl+Shift+P → Manage Models → OpenAI Compatible, or MCP: `cvc mcp` |
| Antigravity | Google Login | MCP only: add cvc in MCP settings → `cvc mcp` |
| Cursor | API Key Override | Settings → Models → Override Base URL → `http://127.0.0.1:8000/v1` |
| Windsurf | Account Login | MCP only: add cvc in Cascade MCP settings → `cvc mcp` |
| Claude Code CLI | API Key | `export ANTHROPIC_BASE_URL=http://127.0.0.1:8000` → native `/v1/messages` |
| Codex CLI | API Key | `model_provider = "cvc"` in `~/.codex/config.toml` |
| Continue.dev / Cline | API Key | Base URL → `http://127.0.0.1:8000/v1`, API Key → `cvc` |
| Aider / Open WebUI | API Key | Standard OpenAI-compatible endpoint |
| LangChain / CrewAI / AutoGen | API Key | Use CVC's function-calling tools (`GET /cvc/tools`) |
Auth pass-through: When Claude Code or Codex CLI sends its own API key, CVC forwards it to the upstream provider. No need to store API keys in CVC for these tools.
Run cvc connect for interactive, tool-specific setup instructions.
📚 CLI Reference

| Command | Description |
|---|---|
| `cvc` | Launch the CVC Agent: interactive AI coding assistant |
| `cvc agent` | Same as above (explicit subcommand) |
| `cvc agent --provider <p>` | Agent with a specific provider (anthropic, openai, google, ollama) |
| `cvc agent --model <m>` | Agent with a model override |
| ──── Launch External Tools ──── | |
| `cvc launch <tool>` | Zero-config auto-launch of any AI tool through CVC |
| `cvc up` | One command: setup + init + serve proxy |
| ──── Setup & Configuration ──── | |
| `cvc setup` | Interactive setup wizard (choose provider & model) |
| `cvc init` | Initialize `.cvc/` in your project |
| `cvc serve` | Start the Cognitive Proxy (API-based tools) |
| `cvc mcp` | Start the MCP server (auth-based IDEs) |
| `cvc connect` | Interactive tool connection wizard |
| ──── Time Machine ──── | |
| `cvc status` | Show branch, HEAD, context size |
| `cvc log` | View commit history |
| `cvc commit -m "message"` | Create a cognitive checkpoint |
| `cvc branch <name>` | Create an exploration branch |
| `cvc merge <branch>` | Semantic merge into the active branch |
| `cvc restore <hash>` | Time-travel to a previous state |
| `cvc sessions` | View Time Machine session history |
| ──── Utilities ──── | |
| `cvc install-hooks` | Install Git ↔ CVC sync hooks |
| `cvc capture-snapshot` | Link the current Git commit to CVC state |
| `cvc doctor` | Health-check your environment |
🔗 Git Integration
CVC doesn't replace Git; it bridges with it.

| Feature | What It Does |
|---|---|
| Shadow Branches | CVC state lives on `cvc/main`, keeping your main branch clean |
| Git Notes | Every git commit is annotated with the CVC hash: "What was the AI thinking when it wrote this?" |
| post-commit hook | Auto-captures cognitive state after every git commit |
| post-checkout hook | Auto-restores the agent's brain when you `git checkout` an old commit |

🔄 When you check out an old version of your code, CVC automatically restores the agent's context to what it was when that code was written.
✨ True cognitive time-travel.
⏱️ Time Machine Mode
Like macOS Time Machine, but for AI agent conversations.
Every conversation is automatically saved. Nothing is ever lost.
When you use cvc (the agent) or cvc launch, Time Machine mode is enabled by default:
| Feature | Description |
|---|---|
| Auto-commit | Every 5 assistant turns (agent) or 3 turns (proxy), configurable |
| Session tracking | Detects which tool is connected, tracks start/end, message counts |
| Smart messages | Auto-commits include turn number and conversation summary |
| Zero friction | Just `cvc` and go, or `cvc launch claude` for external tools |
| Session persistence | Context restored from CVC on next launch |
# View session history
cvc sessions
# Customize auto-commit interval (agent)
CVC_AGENT_AUTO_COMMIT=3 cvc agent # Commit every 3 turns
# Customize auto-commit interval (proxy)
CVC_TIME_MACHINE_INTERVAL=5 cvc up # Commit every 5 turns
# Disable time machine for external tools
cvc launch claude --no-time-machine
Supported External Tools
| Tool | Launch Command | How It Connects |
|---|---|---|
| Claude Code CLI | `cvc launch claude` | Sets `ANTHROPIC_BASE_URL` → native `/v1/messages` |
| Aider | `cvc launch aider` | Sets `OPENAI_API_BASE` + model flag |
| OpenAI Codex CLI | `cvc launch codex` | Sets `OPENAI_API_BASE` |
| Gemini CLI | `cvc launch gemini` | Sets `GEMINI_API_BASE_URL` |
| Kiro CLI | `cvc launch kiro` | Sets `OPENAI_API_BASE` |
| Cursor | `cvc launch cursor` | Writes `.cursor/mcp.json` + opens IDE |
| VS Code | `cvc launch code` | Writes `.vscode/mcp.json` + configures BYOK |
| Windsurf | `cvc launch windsurf` | Writes MCP config + opens IDE |
⚡ Why It's Cheap
CVC structures prompts so committed history becomes a cacheable prefix. When you rewind to a checkpoint, the model doesn't reprocess anything it's already seen.

| Metric | ❌ Without CVC | ✅ With CVC |
|---|---|---|
| Cost per restore | Full price | ~90% cheaper |
| Latency per restore | Full processing | ~85% faster |
| Checkpoint frequency | Impractical | Economically viable |

🔥 Works today with Anthropic, OpenAI, Google Gemini, and Ollama. 💡 Prompt caching is optimised per provider.
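Conceptually, a restore replays committed history byte-for-byte, so the provider sees an identical prefix and can serve it from cache instead of reprocessing it. A toy illustration of that stability property (not CVC's actual mechanism):

```python
import hashlib

def cache_key(messages):
    """A stable message prefix hashes to a stable key; providers exploit
    exactly this kind of byte-identical prefix for prompt caching."""
    blob = "\x00".join(f"{m['role']}:{m['content']}" for m in messages)
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()

history = [
    {"role": "user", "content": "fix auth"},
    {"role": "assistant", "content": "patched auth.py"},
]
before = cache_key(history)

# Replaying committed history verbatim yields the same key, so the cached
# prefix is reused; any edit to the prefix would change it and miss the cache.
assert cache_key(list(history)) == before
assert cache_key(history + [{"role": "user", "content": "next"}]) != before
```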
🤖 Supported Providers
Pick your provider. CVC handles the rest.

| Provider | Default Model | Alternatives | Notes |
|---|---|---|---|
| Anthropic | `claude-opus-4-6` | `claude-opus-4-5`, `claude-sonnet-4-5`, `claude-haiku-4-5` | Prompt caching with `cache_control` |
| OpenAI | `gpt-5.2` | `gpt-5.2-codex`, `gpt-5-mini`, `gpt-4.1` | Automatic prefix caching |
| Google | `gemini-3-pro-preview` | `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-lite` | OpenAI-compatible endpoint |
| Ollama | `qwen2.5-coder:7b` | `qwen3-coder:30b`, `devstral:24b`, `deepseek-r1:8b` | 100% local, no API key needed |
⚙️ Configuration
All via environment variables; no config files to manage.

| Variable | Default | What It Does |
|---|---|---|
| `CVC_AGENT_ID` | `sofia` | Agent identifier |
| `CVC_DEFAULT_BRANCH` | `main` | Default branch |
| `CVC_ANCHOR_INTERVAL` | `10` | Full snapshot every N commits (others are delta-compressed) |
| `CVC_PROVIDER` | `anthropic` | LLM provider |
| `CVC_MODEL` | auto | Model name (auto-detected per provider) |
| `CVC_AGENT_AUTO_COMMIT` | `5` | Agent auto-checkpoint interval (turns) |
| `CVC_TIME_MACHINE_INTERVAL` | `3` | Proxy auto-commit interval (turns) |
| `ANTHROPIC_API_KEY` | – | Required for the anthropic provider |
| `OPENAI_API_KEY` | – | Required for the openai provider |
| `GOOGLE_API_KEY` | – | Required for the google provider |
| `CVC_HOST` | `127.0.0.1` | Proxy host |
| `CVC_PORT` | `8000` | Proxy port |
| `CVC_VECTOR_ENABLED` | `false` | Enable semantic search (Chroma) |
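How such env-driven configuration is typically resolved: each variable is read with its documented default as a fallback. A sketch mirroring the defaults in the table above (not CVC's internal code):

```python
import os

# Each setting falls back to the documented default when the env var is unset;
# exporting a variable before launch is all it takes to override it.
cfg = {
    "provider": os.environ.get("CVC_PROVIDER", "anthropic"),
    "host": os.environ.get("CVC_HOST", "127.0.0.1"),
    "port": int(os.environ.get("CVC_PORT", "8000")),
    "auto_commit": int(os.environ.get("CVC_AGENT_AUTO_COMMIT", "5")),
}
```

So `CVC_PORT=9000 cvc up` would change only the port, leaving every other setting at its default.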
🏗️ Architecture
cvc/
├── __init__.py          # Package root, version
├── __main__.py          # python -m cvc entry point
├── cli.py               # Click CLI: all commands, setup wizard, dark red theme
├── proxy.py             # FastAPI proxy: intercepts LLM API calls
├── launcher.py          # Zero-config auto-launch for AI tools
├── mcp_server.py        # Model Context Protocol server
│
├── agent/               # Built-in AI coding agent (v0.6.0)
│   ├── __init__.py      # Exports run_agent()
│   ├── chat.py          # AgentSession REPL loop, slash commands, auto-commit
│   ├── llm.py           # Unified LLM client: tool calling for all 4 providers
│   ├── tools.py         # 15 tool definitions in OpenAI function-calling schema
│   ├── executor.py      # Tool execution engine: file ops, shell, CVC operations
│   ├── system_prompt.py # Dynamic Claude Code-style system prompt builder
│   └── renderer.py      # Rich terminal rendering with #2C0000 dark red theme
│
├── adapters/            # Provider-specific prompt formatting
│   ├── base.py          # Abstract BaseAdapter
│   ├── anthropic.py     # Anthropic adapter (prompt caching)
│   ├── openai.py        # OpenAI adapter
│   ├── google.py        # Google adapter
│   └── ollama.py        # Ollama adapter
│
├── core/                # Data layer
│   ├── models.py        # Pydantic schemas, config, Merkle DAG
│   └── database.py      # SQLite + CAS + Chroma storage
│
├── operations/          # CVC engine
│   ├── engine.py        # Commit, branch, merge, restore
│   └── state_machine.py # LangGraph command routing
│
└── vcs/                 # Git bridge
    └── bridge.py        # Shadow branches, Git notes, hooks
🎯 Who Is This For?

| Solo Developers | Teams & Organizations | Open Source |
|---|---|---|
| Your AI stops losing context mid-session. Explore multiple approaches. Undo mistakes. Never re-explain the same thing twice. | Review the AI's reasoning, not just its output. Cryptographic audit trails. Shared cognitive state across team members. Compliance-ready. | See how an AI-generated PR was produced. Inspect for hallucination patterns. Build project knowledge bases from commit embeddings. |
🗺️ Roadmap

| Feature | Status |
|---|---|
| Built-in Agent CLI | ✅ Shipped in v0.6.0: 15 tools, 4 providers, slash commands |
| Anthropic Adapter | ✅ Claude Opus 4.6 / 4.5 / Sonnet / Haiku |
| OpenAI Adapter | ✅ GPT-5.2 / GPT-5.2-Codex / GPT-5-mini |
| Google Gemini Adapter | ✅ Gemini 3 Pro Preview / 2.5 Pro / 2.5 Flash |
| Ollama (Local) | ✅ Qwen 2.5 Coder / Qwen 3 Coder / DeepSeek-R1 / Devstral |
| MCP Server | ✅ Native Model Context Protocol (stdio + SSE) |
| Zero-config Launch | ✅ `cvc launch claude` / `aider` / `codex` / `cursor` / etc. |
| Git Bridge | ✅ Shadow branches, Git notes, auto-hooks |
| VS Code Extension | 🚧 Planned: visual commit graph and time-travel slider |
| Multi-agent support | 🚧 Planned: shared CVC database with conflict resolution |
| Cloud sync | 🚧 Planned: S3/MinIO for team collaboration |
| Metrics dashboard | 🚧 Planned: cache hit rates, context utilisation, branch success rates |
🤝 Contributing
This repo is public and open to collaboration.
Whether you're fixing a typo or building an entirely new provider adapter, contributions are welcome.
Fork → Branch → Commit → Push → PR
🎯 Areas Where Help Is Needed

| Area | Difficulty |
|---|---|
| Additional Provider Adapters (Mistral, Cohere, etc.) | Medium |
| Tests & edge cases | Easy–Medium |
| VS Code Extension (commit graph visualisation) | Hard |
| Metrics & observability dashboard | Medium |
| Security audit | Medium–Hard |
🛠️ Dev Setup
git clone https://github.com/YOUR_USERNAME/AI-Cognitive-Version-Control.git
cd AI-Cognitive-Version-Control
uv sync --extra dev
📚 Research
CVC is grounded in published research
| Paper | Key Finding |
|---|---|
| ContextBranch | 58.1% context reduction via branching |
| GCC | 11.7% → 40.7% success with rollback |
| Merkle-CRDTs | Structural deduplication for DAGs |
| Prompt Caching | Anthropic/OpenAI/Google token reuse |
📄 License
MIT – see LICENSE
✨ Because AI agents deserve an undo button. ✨
⭐ Star this repo if you believe in giving AI agents memory that actually works.
Made with ❤️ by developers who got tired of AI agents forgetting what they just did.
⭐ Star · 🐛 Bug · 💡 Feature · 🔀 PR