
A sophisticated AI agent toolkit supporting multiple AI providers, with tool calling, enterprise state management, human-in-the-loop (HITL) workflows, graph routing, LangChain support, and structured outputs.


nixagent

Python 3.8+ | License: MIT | AI Skill

A generic, multipurpose AI agent library in Python. The framework is agnostic to specific use cases and architectures, serving as a robust foundation for building autonomous, collaborative AI agents that manage their own context, communicate with each other, and securely use external tools.

🚀 Quick Start

Installation

pip install nixagent

Or, from a source checkout, install the dependencies directly:

pip install -r requirements.txt

Command Line Usage

First, set up your environment configuration by copying .env.example to .env and adding your API keys.

# Ask a question directly
python app.py "What files are in the current directory?"

# Interactive mode
python app.py

# With custom settings
python app.py "Analyze the code structure" --no-save

Python Library Usage

from nixagent import Agent

# Initialize the core agent
agent = Agent(
    name="MainAgent",
    system_prompt="You are a highly capable AI assistant that uses available tools to accomplish goals."
)

result = agent.run(user_prompt="List all Python files in the project")
print(result)

✨ Features

  • ๐ŸŒ Standardized API Interface: Uses pure requests following the OpenAI native JSON structure. Compatible with OpenAI, Vertex, Local LLMs (via Ollama/vLLM), Groq, and more.
  • ๐Ÿค– Autonomous Agents: Agents maintain independent conversation histories and automatically delegate sub-tasks when needed.
  • ๐Ÿ”Œ Model Context Protocol (MCP): Dynamic tool extension via MCP Servers via .mcp.json.
  • ๐Ÿ› ๏ธ Rich Built-In Tools: Deep system-level tools covering regex-based file searching, exact content mapping, disk manipulation, and secure subprocess execution.
  • ๐Ÿ—ฃ๏ธ Inter-Agent Collaboration: Support for multiple sub-agents operating concurrently under the same framework via .register_collaborator(agent).
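Because the framework speaks the OpenAI-native JSON structure, any compatible endpoint accepts the same request shape. The sketch below shows what such a chat-completion payload with a single tool definition looks like; the tool name and schema are illustrative examples, not nixagent internals:

```python
import json

# A minimal OpenAI-style chat completion payload with one tool definition.
# Any OpenAI-compatible endpoint (OpenAI, Groq, Ollama, vLLM, ...) accepts
# this shape via POST {base_url}/chat/completions.
payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "List the files in the current directory."},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                # Hypothetical tool for illustration only.
                "name": "list_files",
                "description": "List files in a directory.",
                "parameters": {
                    "type": "object",
                    "properties": {"path": {"type": "string"}},
                    "required": ["path"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))
```

When the model decides to call a tool, the response carries a `tool_calls` entry that the agent loop executes before sending the result back as a `tool` message.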

📦 Project Structure

framework/
├── app.py                # Main CLI application
├── nixagent/             # Core framework mechanics
│   ├── __init__.py       # Library exports
│   ├── agent.py          # Core contextual autonomous Agent
│   ├── llm.py            # Central HTTP-based LLM orchestration
│   ├── logger.py         # Central system execution logger
│   ├── mcp.py            # Model Context Protocol definition and bindings
│   ├── providers/        # LLM vendor-specific HTTP adapters
│   │   ├── openai.py
│   │   ├── anthropic.py
│   │   ├── gemini.py
│   │   └── vertex.py
│   └── tools/            # Default native tools
│       ├── __init__.py   # Tool bindings & descriptions
│       ├── cmd.py        # Subprocess shell extensions
│       └── fs.py         # File system native operations
├── mcp.json              # Model Context Protocol server mapping
├── docs/                 # Additional documentation
├── requirements.txt      # Python dependencies
├── .env                  # Environment variables (API keys, provider settings)
└── README.md             # This file

โš™๏ธ Configuration

Create a .env file in your project root:

# LLM Provider (openai, anthropic, gemini, or vertex)
PROVIDER=openai

# OpenAI Configuration
OPENAI_API_KEY=your_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4o

# Anthropic Configuration
ANTHROPIC_API_KEY=your_anthropic_api_key_here
ANTHROPIC_BASE_URL=https://api.anthropic.com/v1
ANTHROPIC_MODEL=claude-3-opus-20240229

# Gemini Configuration
GEMINI_API_KEY=your_gemini_api_key_here
GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
GEMINI_MODEL=gemini-2.5-flash

# Vertex AI Configuration
VERTEX_API_KEY=your_vertex_api_key_here
VERTEX_BASE_URL=https://aiplatform.googleapis.com/v1
VERTEX_MODEL=gemini-2.5-flash-lite

# Tool and Processing Configuration
MAX_ITERATIONS=25

# Logging Configuration
LOG_LEVEL=INFO
LOG_FILE=agent.log  # (Optional) Route all agent tool execution traces to this file instead of stdout
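Notice that each provider's settings share a common naming pattern: the selected PROVIDER value, uppercased, prefixes the _API_KEY, _BASE_URL, and _MODEL variables. A minimal sketch of that lookup, assuming this convention (nixagent's own loader may differ):

```python
import os

# Resolve the active provider's configuration from the environment,
# following the PROVIDER-prefixed naming convention in the .env sample.
provider = os.getenv("PROVIDER", "openai").lower()
prefix = provider.upper()

config = {
    "api_key": os.getenv(f"{prefix}_API_KEY", ""),
    "base_url": os.getenv(f"{prefix}_BASE_URL", "https://api.openai.com/v1"),
    "model": os.getenv(f"{prefix}_MODEL", "gpt-4o"),
    "max_iterations": int(os.getenv("MAX_ITERATIONS", "25")),
}

print(provider, config["model"], config["max_iterations"])
```

Switching providers is then a one-line change to PROVIDER, with the per-provider blocks left in place.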

🔌 Using MCP Servers

Add server definitions to your mcp.json file in the root directory:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "./database.db"],
      "active": true
    }
  }
}

The framework's MCPManager automatically bootstraps all active MCP servers, parses their schemas, and loads their tools natively alongside standard tools upon Agent initialization.
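That bootstrap step can be sketched roughly as: parse mcp.json, keep only entries flagged "active", and build the launch command for each server. (The real MCPManager additionally speaks the MCP protocol to each server to discover its tool schemas.) The second, inactive server below is a hypothetical entry added for illustration:

```python
import json

# Contents of mcp.json, inlined for the sketch; the framework reads this
# from the project root instead.
MCP_JSON = """
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "./database.db"],
      "active": true
    },
    "scratch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
      "active": false
    }
  }
}
"""

servers = json.loads(MCP_JSON)["mcpServers"]

# Keep only servers flagged active, and assemble each one's launch command.
commands = {
    name: [spec["command"], *spec.get("args", [])]
    for name, spec in servers.items()
    if spec.get("active", False)
}

print(commands)
```

Only "sqlite" survives the filter here; flipping "active" to true on an entry is all it takes to expose that server's tools to the agent on the next run.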

๐Ÿค Collaborative Agents

Agents can securely delegate work to one another by registering collaborators.

from nixagent import Agent

research_agent = Agent("Researcher", "You perform file system research.")
writer_agent = Agent("Writer", "You answer questions accurately.")

writer_agent.register_collaborator(research_agent)

writer_agent.run("Ask the Researcher to find all text files and read them to me.")

🤖 AI Skill

nixagent is available as an installable AI coding agent skill, giving your AI assistant full knowledge of the framework, its API, and usage patterns.

Install the Skill

npx skills add technicalheist/nixagent

This installs the skill into your project's agent directories (Cursor, Copilot, Cline, Antigravity, and more) so your AI assistant can immediately understand and use nixagent without extra explanation.

What's Included

The skill (skills/nixagent/SKILL.md) contains:

  • Full installation & environment setup guide
  • Documentation for all 7 core features
  • Provider-specific usage (OpenAI, Anthropic, Gemini, Vertex)
  • Code reference examples in skills/nixagent/examples/

Find the Skill

  • GitHub: technicalheist/nixagent › skills/nixagent
  • Skill Registry: skills.sh

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


