Rust Coder
AI-powered Rust code generation, compilation, and error fixing MCP server
AI agents can autonomously write and execute software programs to accomplish their tasks and goals. The Rust language is a perfect fit for AI agents: the powerful Rust compiler provides real-time, autonomous feedback that helps ensure the validity of generated code. The Rust Coder project is an open source effort to provide such tools. It consists of API and MCP services that generate fully functional Rust projects from natural language descriptions. These services leverage LLMs to create complete Rust cargo projects, compile Rust source code, and automatically fix compiler errors.
Features
- Generate Rust Projects - Transform text descriptions into complete Rust projects.
- Automatic Compilation & Fixing - Detect and resolve errors during compilation.
- Vector Search - Search for similar projects and errors.
- Docker Containerization - Easy deployment with Docker.
- Asynchronous Processing - Handle long-running operations efficiently.
- Multiple Service Interfaces - REST API and MCP (Model Context Protocol) interface.
Prerequisites
Ensure you have the following installed:
- Docker & Docker Compose
Or, if you want to run the services directly on your own computer:
- Python 3.8+
- Rust compiler and cargo tools
Install
git clone https://github.com/WasmEdge/Rust_coder
cd Rust_coder
Configure and run
Using Docker (Recommended)
Create the .env file and specify your own LLM API server. The default config assumes that you have a Gaia node running on localhost port 8080. The alternative configuration shown below uses a public Gaia node for coding assistance.
LLM_API_BASE=https://0x9fcf7888963793472bfcb8c14f4b6b47a7462f17.gaia.domains/v1
LLM_MODEL=gemma-3-27b-it-q4_0
LLM_EMBED_MODEL=nomic-embed
LLM_API_KEY=1234ABCD
LLM_EMBED_SIZE=768
Start the services.
docker-compose up -d
Stop the services.
docker-compose stop
Manual Setup
By default, you will need a Qdrant server running on localhost port 6333. You also need a local Gaia node. Set the following environment variables in your terminal to point to the Qdrant and Gaia instances, as well as your Rust compiler tools.
QDRANT_HOST=localhost
QDRANT_PORT=6333
LLM_API_BASE=http://localhost:8080/v1
LLM_MODEL=Qwen2.5-Coder-3B-Instruct
LLM_EMBED_MODEL=nomic-embed
LLM_API_KEY=your_api_key
LLM_EMBED_SIZE=768
CARGO_PATH=/path/to/cargo
RUST_COMPILER_PATH=/path/to/rustc
Start the services.
pip install -r requirements.txt
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
Usage
The API provides the following endpoints:
Generate a Project
Endpoint: POST /generate-sync
Example:
curl -X POST http://localhost:8000/generate-sync \
-H "Content-Type: application/json" \
-d '{"description": "A command-line calculator in Rust", "requirements": "Should support addition, subtraction, multiplication, and division"}'
Request Body:
{
"description": "A command-line calculator in Rust",
"requirements": "Should support addition, subtraction, multiplication, and division"
}
Response:
The combined_text field contains the flat text output of Rust project files that can be used as input for /compile and /compile-and-fix API calls.
{
"success": true,
"message":"Project generated successfully",
"combined_text":"[filename: Cargo.toml]\n[package]\nname = \"calculator\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html\n\n[dependencies]\nclap = { version = \"4.5\", features = [\"derive\"] }\n\n[filename: src/main.rs]\nuse std::io;\nuse clap::Parser;\n\n#[derive(Parser, Debug)]\n#[command(author, version, about, long_about = None)]\nstruct Args {\n /// The first number\n #[arg(required = true)]\n num1: f64,\n /// Operator (+, -, *, /)\n #[arg(required = true, value_parser = clap::value_parser!(f64))]\n operator: String,\n /// The second number\n #[arg(required = true)]\n num2: f64,\n}\n\nfn main() -> Result<(), Box<dyn std::error::Error>> {\n let args = Args::parse();\n\n match args.operator.as_str() {\n \"+\" => {\n println!(\"{}\", args.num1 + args.num2);\n }\n \"-\" => {\n println!(\"{}\", args.num1 - args.num2);\n }\n \"*\" => {\n println!(\"{}\", args.num1 * args.num2);\n }\n \"/\" => {\n if args.num2 == 0.0 {\n eprintln!(\"Error: Cannot divide by zero.\");\n std::process::exit(1);\n }\n println!(\"{}\", args.num1 / args.num2);\n }\n _ => {\n eprintln!(\"Error: Invalid operator. Use +, -, *, or /\");\n std::process::exit(1);\n }\n }\n\n Ok(())\n}\n\n[filename: README.md]\n# Calculator\n\nA simple command-line calculator written in Rust. Supports addition, subtraction, multiplication, and division.\n\n## Usage\n\nRun the program with two numbers and an operator as arguments:\n\n```bash\ncargo run <num1> <operator> <num2>\n```\n\nWhere `<operator>` is one of `+`, `-`, `*`, or `/`.\n\n**Example:**\n\n```bash\ncargo run 5 + 3\n# Output: 8\n\ncargo run 10 / 2\n# Output: 5\n\ncargo run 7 * 4\n# Output: 28\n```\n\n## Error Handling\n\nThe calculator handles division by zero and invalid operator inputs. Error messages are printed to standard error, and the program exits with a non-zero exit code in case of an error.\n\n\n# Build succeeded",
"files":{
"Cargo.toml":"[package]\nname = \"calculator\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html\n\n[dependencies]\nclap = { version = \"4.5\", features = [\"derive\"] }",
"src/main.rs":"use std::io;\nuse clap::Parser;\n\n#[derive(Parser, Debug)]\n#[command(author, version, about, long_about = None)]\nstruct Args {\n /// The first number\n #[arg(required = true)]\n num1: f64,\n /// Operator (+, -, *, /)\n #[arg(required = true, value_parser = clap::value_parser!(f64))]\n operator: String,\n /// The second number\n #[arg(required = true)]\n num2: f64,\n}\n\nfn main() -> Result<(), Box<dyn std::error::Error>> {\n let args = Args::parse();\n\n match args.operator.as_str() {\n \"+\" => {\n println!(\"{}\", args.num1 + args.num2);\n }\n \"-\" => {\n println!(\"{}\", args.num1 - args.num2);\n }\n \"*\" => {\n println!(\"{}\", args.num1 * args.num2);\n }\n \"/\" => {\n if args.num2 == 0.0 {\n eprintln!(\"Error: Cannot divide by zero.\");\n std::process::exit(1);\n }\n println!(\"{}\", args.num1 / args.num2);\n }\n _ => {\n eprintln!(\"Error: Invalid operator. Use +, -, *, or /\");\n std::process::exit(1);\n }\n }\n\n Ok(())\n}",
"README.md":"# Calculator\n\nA simple command-line calculator written in Rust. Supports addition, subtraction, multiplication, and division.\n\n## Usage\n\nRun the program with two numbers and an operator as arguments:\n\n```bash\ncargo run <num1> <operator> <num2>\n```\n\nWhere `<operator>` is one of `+`, `-`, `*`, or `/`.\n\n**Example:**\n\n```bash\ncargo run 5 + 3\n# Output: 8\n\ncargo run 10 / 2\n# Output: 5\n\ncargo run 7 * 4\n# Output: 28\n```\n\n## Error Handling\n\nThe calculator handles division by zero and invalid operator inputs. Error messages are printed to standard error, and the program exits with a non-zero exit code in case of an error."
},
"build_output":null,
"build_success":true
}
Compile a Project
Endpoint: POST /compile
Example:
curl -X POST http://localhost:8000/compile \
-H "Content-Type: application/json" \
-d '{
"code": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n println!(\"Hello, World!\");\n}"
}'
Request Body:
{
"code": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n println!(\"Hello, World!\");\n}"
}
Response:
{
"success": true,
"files": [
"Cargo.toml",
"src/main.rs"
],
"build_output": "Build successful",
"run_output": "Hello, World!\n"
}
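The `code` field flattens an entire project into the same `[filename: ...]` format used by the `combined_text` output. A minimal sketch of producing that field from a dict of files; the `to_combined_text` helper name is my own, not part of the RustCoder API:

```python
# Sketch: flatten a dict of project files into the "[filename: ...]" format
# expected by the `code` field of /compile and /compile-and-fix.
# `to_combined_text` is a hypothetical helper, not a RustCoder API function.

def to_combined_text(files):
    """Join files into the flat multi-file format, one [filename: ...] header each."""
    sections = []
    for path, content in files.items():
        sections.append("[filename: " + path + "]\n" + content)
    return "\n\n".join(sections)

files = {
    "Cargo.toml": '[package]\nname = "hello_world"\nversion = "0.1.0"\nedition = "2021"\n\n[dependencies]',
    "src/main.rs": 'fn main() {\n    println!("Hello, World!");\n}',
}
code_field = to_combined_text(files)
print(code_field.splitlines()[0])  # [filename: Cargo.toml]
```

The resulting string can be placed directly into the JSON `code` field of a `/compile` request.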
Compile and fix errors
Endpoint: POST /compile-and-fix
Example:
curl -X POST http://localhost:8000/compile-and-fix \
-H "Content-Type: application/json" \
-d '{
"code": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n print \"Hello, World!\" \n}",
"description": "A simple hello world program",
"max_attempts": 3
}'
Request Body:
{
"code": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n print \"Hello, World!\" \n}",
"description": "A simple hello world program",
"max_attempts": 3
}
Response:
The combined_text field contains the flat text output of Rust project files that is in the same format as the input code field.
{
"success": true,
"message":"Code fixed and compiled successfully",
"attempts":[
{
"attempt":1,
"success":false,
"output":" Compiling hello_world v0.1.0 (/tmp/tmpbgeg4x_e)\nerror: expected one of `!`, `.`, `::`, `;`, `?`, `{`, `}`, or an operator, found `\"Hello, World!\"`\n --> src/main.rs:2:11\n |\n2 | print \"Hello, World!\" \n | ^^^^^^^^^^^^^^^ expected one of 8 possible tokens\n\nerror: could not compile `hello_world` (bin \"hello_world\") due to 1 previous error\n"
},
{
"attempt":2,
"success":true,
"output":null
}
],
"combined_text":"[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n println!(\"Hello, World!\");\n}\n\n[filename: README.md]\n# Hello World\n\nThis is a simple \"Hello, World!\" program in Rust. It prints the message \"Hello, World!\" to the console.\n\nTo run it:\n\n1. Make sure you have Rust installed ([https://www.rust-lang.org/](https://www.rust-lang.org/)).\n2. Save the code as `src/main.rs`.\n3. Run `cargo run` in the terminal from the project directory.",
"files":{
"Cargo.toml":"[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html\n\n[dependencies]",
"src/main.rs":"fn main() {\n println!(\"Hello, World!\");\n}",
"README.md":"# Hello World\n\nThis is a simple \"Hello, World!\" program in Rust. It prints the message \"Hello, World!\" to the console.\n\nTo run it:\n\n1. Make sure you have Rust installed ([https://www.rust-lang.org/](https://www.rust-lang.org/)).\n2. Save the code as `src/main.rs`.\n3. Run `cargo run` in the terminal from the project directory."
},
"build_output":"Build successful",
"run_output":"Hello, World!\n",
"build_success":true
}
Generate a Project (Async)
Endpoint: POST /generate
Example:
curl -X POST http://localhost:8000/generate \
-H "Content-Type: application/json" \
-d '{"description": "A command-line calculator in Rust", "requirements": "Should support addition, subtraction, multiplication, and division"}'
Request Body:
{
"description": "A command-line calculator in Rust",
"requirements": "Should support addition, subtraction, multiplication, and division"
}
Response:
{
"project_id": "123e4567-e89b-12d3-a456-426614174000",
"status": "processing",
"message": "Project generation started"
}
Check Project Status
Endpoint: GET /project/{project_id}
Example:
curl http://localhost:8000/project/123e4567-e89b-12d3-a456-426614174000
Response:
{
"project_id": "123e4567-e89b-12d3-a456-426614174000",
"status": "completed",
"message": "Project generated successfully",
"files": ["Cargo.toml", "src/main.rs", "README.md"],
"build_output": "...",
"run_output": "..."
}
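Since `/generate` is asynchronous, a client typically polls this endpoint until the status leaves `processing`. A hedged sketch of such a polling loop; the `fetch_status` callable is a stand-in for the real HTTP GET (stubbed here so the example runs without a live server):

```python
# Sketch: poll GET /project/{id} until it leaves the "processing" state.
# `fetch_status` is injected so the loop can run without a server; in
# practice it would issue the HTTP request (e.g. with urllib or requests).
import time

def wait_for_project(project_id, fetch_status, poll_interval=0.0, max_polls=60):
    """Poll until status is no longer 'processing'; return the final response."""
    for _ in range(max_polls):
        resp = fetch_status(project_id)
        if resp["status"] != "processing":
            return resp
        time.sleep(poll_interval)
    raise TimeoutError("project %s still processing after %d polls" % (project_id, max_polls))

# Stubbed responses standing in for the real API
responses = iter([
    {"status": "processing"},
    {"status": "processing"},
    {"status": "completed", "message": "Project generated successfully"},
])
final = wait_for_project("123e4567", lambda _pid: next(responses))
print(final["status"])  # completed
```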
Get Generated Files
Endpoint: GET /project/{project_id}/files/path_to_file
Example:
curl http://localhost:8000/project/123e4567-e89b-12d3-a456-426614174000/files/src/main.rs
Response:
fn main() {
... ...
}
Command Line Interface (CLI)
The RustCoder CLI provides a command-line interface to interact with the API endpoints and manage projects.
Installation
Install the CLI dependencies:
pip install -r requirements.txt
Basic Usage
Run the CLI with:
python -m cli.main --help
Set the server URL via environment variable or flag:
export RUSTCODER_SERVER=http://localhost:8000
# or
python -m cli.main --server http://localhost:8000
Available Commands
Project Generation
Generate a project (async):
python -m cli.main generate --description "A command-line calculator in Rust"
Generate a project (sync):
python -m cli.main generate --description "A web API with actix" --sync
Generate with requirements file:
python -m cli.main generate --description "CLI tool" --requirements requirements.txt
Project Management
Check project status:
python -m cli.main status --project <project_id>
Watch project until completion:
python -m cli.main status --project <project_id> --watch
Download project as zip:
python -m cli.main download --project <project_id> --out my_project.zip
View project file contents:
python -m cli.main cat --project <project_id> --file src/main.rs
Code Compilation & Fixing
Compile Rust code from file:
python -m cli.main compile --code ./artifacts/multifile.txt
Compile Rust code from project ID:
python -m cli.main compile --project <project_id>
Auto-fix compilation errors from file:
python -m cli.main fix --code ./artifacts/multifile.txt --description "hello world" --max-attempts 5
Auto-fix compilation errors from project ID:
python -m cli.main fix --project <project_id> --description "hello world" --max-attempts 5
Utility Commands
Show version info:
python -m cli.main version
Show version as JSON:
python -m cli.main version --json
Get shell completions:
python -m cli.main completions
Complete Workflow Example
1. Generate a project:
python -m cli.main generate --description "A simple HTTP server" --sync
2. Save the output to a file:
python -m cli.main generate --description "A simple HTTP server" --sync > project.txt
3. Compile the project:
python -m cli.main compile --code project.txt
4. If compilation fails, auto-fix:
python -m cli.main fix --code project.txt --description "A simple HTTP server" --max-attempts 3
Input File Format
The CLI expects multi-file input in the format:
[filename: Cargo.toml]
[package]
name = "my_project"
version = "0.1.0"
edition = "2021"
[filename: src/main.rs]
fn main() {
println!("Hello, world!");
}
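A parser for this multi-file format can be sketched in a few lines. The function below is illustrative only, not the service's actual `response_parser.py` implementation:

```python
# Sketch: parse the "[filename: ...]" multi-file format back into a dict.
# Illustrative only; the real service uses app/response_parser.py.
import re

def parse_combined_text(text):
    """Split flat text on [filename: ...] headers; return {path: content}."""
    files = {}
    current = None
    lines = []
    for line in text.splitlines():
        m = re.match(r"^\[filename:\s*(.+?)\]\s*$", line)
        if m:
            if current is not None:
                files[current] = "\n".join(lines).strip("\n")
            current = m.group(1)
            lines = []
        elif current is not None:
            lines.append(line)
    if current is not None:
        files[current] = "\n".join(lines).strip("\n")
    return files

sample = (
    "[filename: Cargo.toml]\n[package]\nname = \"my_project\"\n\n"
    "[filename: src/main.rs]\nfn main() {\n    println!(\"Hello, world!\");\n}"
)
parsed = parse_combined_text(sample)
print(sorted(parsed))  # ['Cargo.toml', 'src/main.rs']
```

Note that TOML section headers like `[package]` do not match the `[filename: ...]` pattern, so file contents containing brackets survive the round trip.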
Environment Variables
RUSTCODER_SERVER: Base URL of the RustCoder API (default: http://localhost:8000)
Output Formats
- Human-readable: Default output with colors and formatting
- JSON: Use the --json flag for machine-readable output
- File redirection: Pipe output to files or other commands
MCP (Model Context Protocol) tools
The MCP server is available over the HTTP SSE transport at http://localhost:3000/sse. It can be accessed using the cmcp command-line client. To install the cmcp tool:
pip install cmcp
Generate a new project
tool: generate
Request example:
cmcp http://localhost:3000 tools/call name=generate arguments:='{"description": "A command-line calculator in Rust", "requirements": "Should support addition, subtraction, multiplication, and division"}'
Response:
{
"meta": null,
"content": [
{
"type": "text",
"text": "[filename: Cargo.toml] ... [filename: src/main.rs] ... ",
"annotations": null
}
],
"isError": false
}
Compile and Fix Rust Code
tool: compile_and_fix
Request example:
cmcp http://localhost:3000 tools/call name=compile_and_fix arguments:='{"code": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n println!(\"Hello, World!\" // Missing closing parenthesis \n}" }'
Response:
{
"meta": null,
"content": [
{
"type": "text",
"text": "[filename: Cargo.toml]\n[package]\nname = \"hello_world\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]\n\n[filename: src/main.rs]\nfn main() {\n println!(\"Hello, World!\"); \n}",
"annotations": null
}
],
"isError": false
}
Project Structure
Rust_coder_lfx/
├── app/                     # Application code
│   ├── compiler.py          # Rust compilation handling
│   ├── llm_client.py        # LLM API client
│   ├── llm_tools.py         # Tools for LLM interactions
│   ├── load_data.py         # Data loading utilities
│   ├── main.py              # FastAPI application & endpoints
│   ├── mcp_tools.py         # MCP-specific tools
│   ├── prompt_generator.py  # LLM prompt generation
│   ├── response_parser.py   # Parse LLM responses into files
│   ├── utils.py             # Utility functions
│   └── vector_store.py      # Vector database interface
├── data/                    # Data storage
│   ├── error_examples/      # Error examples for vector search
│   └── project_examples/    # Project examples for vector search
├── docker-compose.yml       # Docker Compose configuration
├── Dockerfile               # Docker configuration
├── requirements.txt         # Python dependencies
└── .env                     # Environment variables
How It Works
Vector Search: The system uses Qdrant for storing and searching project examples and error solutions.
LLM Integration: Communicates with LlamaEdge API for code generation and error fixing via llm_client.py.
Compilation Feedback Loop: Automatically compiles, detects errors, and fixes them using compiler.py.
File Parsing: Converts LLM responses into project files with response_parser.py.
Architecture
REST API Interface (app/main.py): FastAPI application exposing HTTP endpoints for project generation, compilation, and error fixing.
MCP Interface (mcp_server.py, app/mcp_service.py): Server-Sent Events interface for the same functionality.
Vector Database (app/vector_store.py): Qdrant is used for storing and searching similar projects and error examples.
LLM Integration (app/llm_client.py): Communicates with LLM APIs (like Gaia nodes) for code generation and error fixing.
Compilation Pipeline (app/compiler.py): Handles Rust code compilation, error detection, and provides feedback for fixing.
Process Flow
Project Generation:
- User provides a description and requirements
- System creates a prompt using templates
- LLM generates a complete Rust project
- Response is parsed into individual files (app/response_parser.py)
- Project is compiled to verify correctness
Error Fixing:
- System attempts to compile the provided code
- If errors occur, they're extracted and analyzed
- Vector search may find similar past errors
- LLM receives the errors and original code to generate fixes
- Process repeats until successful or max attempts reached
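The loop above can be sketched as follows; `compile_project` and `llm_fix` are stand-ins for the real `compiler.py` and `llm_client.py` calls, stubbed here so the sketch runs standalone:

```python
# Sketch of the compile-and-fix feedback loop, with the compiler and the
# LLM stubbed out. compile_project and llm_fix are stand-ins for the real
# compiler.py / llm_client.py calls.

def compile_and_fix(code, compile_project, llm_fix, max_attempts=3):
    """Try to compile; on failure, feed errors back to the LLM and retry."""
    attempts = []
    for attempt in range(1, max_attempts + 1):
        ok, output = compile_project(code)
        attempts.append({"attempt": attempt, "success": ok, "output": output})
        if ok:
            return {"success": True, "code": code, "attempts": attempts}
        code = llm_fix(code, output)  # LLM rewrites the code given the errors
    return {"success": False, "code": code, "attempts": attempts}

# Stub compiler: fails until the code uses println! correctly
def fake_compile(code):
    if "println!" in code:
        return True, None
    return False, 'error: expected one of `!`, `.`, ... found `"Hello, World!"`'

# Stub "LLM" that applies the obvious fix
def fake_fix(code, errors):
    return code.replace('print "Hello, World!"', 'println!("Hello, World!");')

result = compile_and_fix('fn main() {\n    print "Hello, World!" \n}', fake_compile, fake_fix)
print(result["success"], len(result["attempts"]))  # True 2
```

The `attempts` list mirrors the shape of the `/compile-and-fix` response shown earlier.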
Advanced: Enhancing Performance with Vector Search
The system uses vector embeddings to find similar projects and error examples, which helps improve code generation quality. Here's how to add your own examples:
Creating Vector Collections
First, you need to create the necessary collections in Qdrant using these curl commands:
# Create project_examples collection with 768 dimensions (default)
curl -X PUT "http://localhost:6333/collections/project_examples" \
-H "Content-Type: application/json" \
-d '{
"vectors": {
"size": 768,
"distance": "Cosine"
}
}'
# Create error_examples collection with 768 dimensions (default)
curl -X PUT "http://localhost:6333/collections/error_examples" \
-H "Content-Type: application/json" \
-d '{
"vectors": {
"size": 768,
"distance": "Cosine"
}
}'
Note: If you've configured a different embedding size via the LLM_EMBED_SIZE environment variable, replace 768 with that value.
Adding Data to Vector Collections
Method 1: Using Python API Directly
For Project Examples:
from app.llm_client import LlamaEdgeClient
from app.vector_store import QdrantStore
# Initialize the components
llm_client = LlamaEdgeClient()
vector_store = QdrantStore()
# Ensure collection exists
vector_store.create_collection("project_examples")
# 1. Prepare your data
project_data = {
"query": "A command-line calculator in Rust",
"example": "Your full project example with code here...",
"project_files": {
"src/main.rs": "fn main() {\n println!(\"Hello, calculator!\");\n}",
"Cargo.toml": "[package]\nname = \"calculator\"\nversion = \"0.1.0\"\nedition = \"2021\"\n\n[dependencies]"
}
}
# 2. Get embedding for the query text
embedding = llm_client.get_embeddings([project_data["query"]])[0]
# 3. Add to vector database
vector_store.add_item(
collection_name="project_examples",
vector=embedding,
item=project_data
)
For Error Examples:
from app.llm_client import LlamaEdgeClient
from app.vector_store import QdrantStore
# Initialize the components
llm_client = LlamaEdgeClient()
vector_store = QdrantStore()
# Ensure collection exists
vector_store.create_collection("error_examples")
# 1. Prepare your error data
error_data = {
"error": "error[E0502]: cannot borrow `*self` as mutable because it is also borrowed as immutable",
"solution": "Ensure mutable and immutable borrows don't overlap by using separate scopes",
"context": "This error occurs when you try to borrow a value mutably while an immutable borrow exists",
"example": "// Before (error)\nfn main() {\n let mut v = vec![1, 2, 3];\n let first = &v[0];\n v.push(4); // Error: cannot borrow `v` as mutable\n println!(\"{}\", first);\n}\n\n// After (fixed)\nfn main() {\n let mut v = vec![1, 2, 3];\n {\n let first = &v[0];\n println!(\"{}\", first);\n } // immutable borrow ends here\n v.push(4); // Now it's safe to borrow mutably\n}"
}
# 2. Get embedding for the error message
embedding = llm_client.get_embeddings([error_data["error"]])[0]
# 3. Add to vector database
vector_store.add_item(
collection_name="error_examples",
vector=embedding,
item=error_data
)
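Under the hood, retrieval ranks stored examples by vector similarity (the collections above are created with cosine distance). A toy illustration of that ranking step, with 3-dimensional vectors standing in for real 768-dimensional embeddings:

```python
# Sketch of how vector search retrieves similar errors: embed the query,
# then rank stored examples by cosine similarity. Toy 3-d vectors stand in
# for real embeddings (768-d by default, per LLM_EMBED_SIZE).
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

stored = {
    "E0502 borrow conflict": [0.9, 0.1, 0.0],
    "E0308 mismatched types": [0.1, 0.9, 0.2],
}
query = [0.85, 0.15, 0.05]  # embedding of the new compiler error
best = max(stored, key=lambda k: cosine_similarity(query, stored[k]))
print(best)  # E0502 borrow conflict
```

In the real system Qdrant performs this search server-side; the sketch only shows why a query error close in embedding space retrieves the matching stored example.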
Method 2: Adding Multiple Examples from JSON Files
First, ensure you have the required dependencies:
pip install qdrant-client openai
Place JSON files in the appropriate directories:
- Project examples: data/project_examples
- Error examples: data/error_examples
Format for project examples (with optional project_files field):
{
"query": "Description of the project",
"example": "Full example code or description",
"project_files": {
"src/main.rs": "// File content here",
"Cargo.toml": "// File content here"
}
}
Format for error examples:
{
"error": "Rust compiler error message",
"solution": "How to fix the error",
"context": "Additional explanation (optional)",
"example": "// Code example showing the fix (optional)"
}
Then run the data loading script:
python -c "from app.load_data import load_project_examples, load_error_examples; load_project_examples(); load_error_examples()"
Method 3: Using the parse_and_save_qna.py Script
For bulk importing from a Q&A format text file:
1. Place your Q&A pairs in a text file with a format similar to QnA_pair.txt.
2. Modify the parse_and_save_qna.py script to point to your file.
3. Run the script:
python parse_and_save_qna.py
Environment Variables for Vector Search
The SKIP_VECTOR_SEARCH environment variable controls whether the system uses vector search:
- SKIP_VECTOR_SEARCH=true - Disables vector search functionality
- SKIP_VECTOR_SEARCH=false (or not set) - Enables vector search
By default, vector search is disabled. To enable it:
- Set SKIP_VECTOR_SEARCH=false in your .env file
- Ensure you have a running Qdrant instance (via Docker Compose or standalone)
- Create the collections as shown above
Contributing
Contributions are welcome! This project uses the Developer Certificate of Origin (DCO) to certify that contributors have the right to submit their code. Follow these steps:
- Fork the repository
- Create your feature branch: git checkout -b feature/amazing-feature
- Make your changes
- Commit your changes with a sign-off: git commit -s -m 'Add some amazing feature'
- Push to the branch: git push origin feature/amazing-feature
- Open a Pull Request
The -s flag automatically adds a Signed-off-by line to your commit message:
Signed-off-by: Your Name <your.email@example.com>
This certifies that you wrote or have the right to submit the code you're contributing according to the Developer Certificate of Origin.
License
Licensed under GPLv3.
Download files
File details
Details for the file iflow_mcp_cardea_mcp_rustcoder-0.1.0.tar.gz.
File metadata
- Download URL: iflow_mcp_cardea_mcp_rustcoder-0.1.0.tar.gz
- Upload date:
- Size: 22.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.0 {"installer":{"name":"uv","version":"0.10.0","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b2acbc39761b8d6d9e70b26e04aa3c7d157be6a39b7cb9a2a4b89cef224d3000 |
| MD5 | d6e95cb249e3ed05cfb162303e1c4969 |
| BLAKE2b-256 | 3dfd2f0fb86a78682f0adaf2f5eccb923db0e81ac3dec2cee4cba25e034f528e |
File details
Details for the file iflow_mcp_cardea_mcp_rustcoder-0.1.0-py3-none-any.whl.
File metadata
- Download URL: iflow_mcp_cardea_mcp_rustcoder-0.1.0-py3-none-any.whl
- Upload date:
- Size: 27.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.0 {"installer":{"name":"uv","version":"0.10.0","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b889c174e6131ed7b2a34754f4468232a6b7d753996010a0db06c808a9945f7b |
| MD5 | e6b623872a195cca453ce1acefd93e9a |
| BLAKE2b-256 | 08856434068ccbd115b1805c80fb92679560aadfb350a209172a8af9eb455dec |