Atlas UI 3

Full-stack LLM chat interface with Model Context Protocol (MCP) integration

Atlas UI 3 is a secure chat application with MCP (Model Context Protocol) integration, developed by Sandia National Laboratories, a U.S. Department of Energy national laboratory, to support U.S. Government customers.


About the Project

Atlas UI 3 is a full-stack LLM chat interface that supports multiple AI models, including those from OpenAI, Anthropic, and Google. Its core feature is the integration with the Model Context Protocol (MCP), which allows the AI assistant to connect to external tools and data sources, enabling complex, real-time workflows.

Features

  • Multi-LLM Support: Connect to various LLM providers.
  • MCP Integration: Extend the AI's capabilities with custom tools.
  • RAG Support: Enhance responses with Retrieval-Augmented Generation.
  • Secure and Configurable: Features group-based access control, compliance levels, and a tool approval system.
  • Modern Stack: Built with React 19, FastAPI, and WebSockets.
  • Python Package: Install and use as a library or CLI tool.

Installation

Install from PyPI (Recommended for Users)

# Install the package
pip install atlas-chat

# Or with uv (faster)
uv pip install atlas-chat

CLI Usage

After installation, three CLI tools are available:

# Set up configuration (run this first!)
atlas-init              # Creates .env and config/ in current directory
atlas-init --minimal    # Creates just a minimal .env file

# Chat with an LLM
atlas-chat "Hello, how are you?"
atlas-chat "What is 2654687621*sqrt(2)?" --tools calculator_evaluate
atlas-chat --list-tools
atlas-chat --list-models

# Start the web server
atlas-server --port 8000
atlas-server --env /path/to/.env --config-folder /path/to/config
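
For orientation, the second chat command above asks the model to route the arithmetic through the calculator_evaluate tool; the value the tool should produce is just the expression evaluated directly:

```python
import math

# The expression from the CLI example above, evaluated in plain Python
result = 2654687621 * math.sqrt(2)
print(result)  # about 3.75e9
```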

Python API Usage

import asyncio
from atlas import AtlasClient

async def main():
    client = AtlasClient()

    # Simple chat
    result = await client.chat("Hello, how are you?")
    print(result.message)

    # Use the calculator MCP tool (tool_choice_required forces tool use)
    result = await client.chat(
        "What is 1234 * 5678?",
        selected_tools=["calculator_evaluate"],
        tool_choice_required=True,
    )
    print(result.message)

    await client.cleanup()

asyncio.run(main())

Synchronous usage:

from atlas import AtlasClient

client = AtlasClient()
result = client.chat_sync("Hello!")
print(result.message)
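
chat_sync presumably bridges into the async API for callers without an event loop. A minimal sketch of that common sync-over-async pattern (illustrative only; SketchClient is a toy stand-in, not Atlas's actual implementation):

```python
import asyncio

class SketchClient:
    """Toy stand-in for AtlasClient, showing the sync-over-async wrapper pattern."""

    async def chat(self, message: str) -> str:
        # A real implementation would await the LLM backend here
        await asyncio.sleep(0)
        return f"echo: {message}"

    def chat_sync(self, message: str) -> str:
        # Drive the async method to completion on a fresh event loop
        return asyncio.run(self.chat(message))

print(SketchClient().chat_sync("Hello!"))  # prints "echo: Hello!"
```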

Quick Start (Development)

Prerequisites

# Install uv package manager (one-time)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create virtual environment and install in editable mode (with dev dependencies)
uv venv && source .venv/bin/activate
uv pip install -e ".[dev]"

This installs the atlas package in editable mode, meaning:

  • All dependencies are installed from pyproject.toml (the single source of truth)
  • The atlas package is importable everywhere without needing PYTHONPATH
  • Edit any Python file in atlas/ and changes take effect immediately
  • CLI commands (atlas-chat, atlas-server, atlas-init) are available
  • Dev tools (pytest, ruff, podman-compose) are included

Alternative: PYTHONPATH (if you can't use editable install)

# Set PYTHONPATH manually when running
PYTHONPATH=/path/to/atlas-ui-3 python atlas/main.py

Local Experimentation and MCP Testing

If you cloned the repo and want to run tests, experiment locally, or test MCP servers, sync the dev dependencies:

uv sync --dev

This installs pytest, ruff, and other development tools into your virtual environment.

Running the Application

Linux/macOS:

bash agent_start.sh

Windows:

.\ps_agent_start.ps1

Note for Windows users: If you encounter frontend build errors related to Rollup dependencies, delete frontend/package-lock.json and frontend/node_modules, then run the script again.

Both scripts automatically detect and work with Docker or Podman. The agent_start.sh script builds the frontend, starts necessary services, and launches the backend server.

Documentation

We have created a set of comprehensive guides to help you get the most out of Atlas UI 3.

  • Getting Started: The perfect starting point for all users. This guide covers how to get the application running with Docker or on your local machine.

  • Administrator's Guide: For those who will deploy and manage the application. This guide details configuration, security settings, access control, and other operational topics.

  • Developer's Guide: For developers who want to contribute to the project. It provides an overview of the architecture and instructions for creating new MCP servers.

Docker / Podman

Quick Start

# 1. Set up local config (copies defaults from atlas/config/)
atlas-init
# Edit .env to add your API keys

# 2. Build the image
podman build -t atlas-ui-3 .

# 3. Run with your local config mounted
podman run -p 8000:8000 \
  -v $(pwd)/config:/app/config:Z \
  --env-file .env \
  atlas-ui-3

The container seeds /app/config from package defaults at build time. Mounting your local config/ folder overrides those defaults, so you can customize llmconfig.yml, mcp.json, etc. without rebuilding.
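
The run command above can also be captured in a compose file; a hypothetical sketch (the service name and file layout are assumptions, not part of the repository):

```yaml
# Hypothetical podman-compose.yml mirroring the podman run command above
services:
  atlas:
    image: atlas-ui-3
    env_file: .env
    ports:
      - "8000:8000"
    volumes:
      - ./config:/app/config:Z   # local config overrides the baked-in defaults
```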

Container Images

Pre-built container images are available at quay.io/agarlan-snl/atlas-ui-3:latest (published automatically from the main branch).

For AI Agent Contributors

If you are an AI agent working on this repository, please refer to the repository's agent guidance documents for the most current and concise guidance.

License

Copyright 2025 National Technology & Engineering Solutions of Sandia, LLC (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the U.S. Government retains certain rights in this software.

MIT License
