Full-stack LLM chat interface with Model Context Protocol (MCP) integration
Project description
Atlas UI 3
Atlas UI 3 is a secure chat application with MCP (Model Context Protocol) integration, developed by Sandia National Laboratories, a U.S. Department of Energy national laboratory, to support U.S. Government customers.
About the Project
Atlas UI 3 is a full-stack LLM chat interface that supports multiple AI models, including those from OpenAI, Anthropic, and Google. Its core feature is the integration with the Model Context Protocol (MCP), which allows the AI assistant to connect to external tools and data sources, enabling complex, real-time workflows.
Features
- Multi-LLM Support: Connect to various LLM providers.
- MCP Integration: Extend the AI's capabilities with custom tools.
- RAG Support: Enhance responses with Retrieval-Augmented Generation.
- Secure and Configurable: Features group-based access control, compliance levels, and a tool approval system.
- Modern Stack: Built with React 19, FastAPI, and WebSockets.
- Python Package: Install and use as a library or CLI tool.
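
As a concept refresher, Retrieval-Augmented Generation retrieves relevant documents and prepends them to the prompt before it reaches the LLM. The following is a toy sketch of that idea only, not Atlas UI 3's actual implementation; the naive keyword-overlap ranking stands in for a real vector search:

```python
# Toy RAG sketch: retrieve the most relevant snippets, then build an
# augmented prompt. Illustrative only; not Atlas UI 3's implementation.
DOCS = [
    "Atlas UI 3 integrates MCP servers as external tools.",
    "The backend is FastAPI with WebSocket streaming.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Naive keyword-overlap score in place of a real vector search.
    def score(doc: str) -> int:
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return sorted(docs, key=score, reverse=True)[:k]

def augment(query: str) -> str:
    context = "\n".join(retrieve(query, DOCS))
    return f"Context:\n{context}\n\nQuestion: {query}"

prompt = augment("How does atlas talk to mcp servers?")
```

The augmented prompt gives the model grounding text it would otherwise lack, which is the core of the RAG feature above.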
Installation
Install from PyPI (Recommended for Users)
```shell
# Install the package
pip install atlas-chat

# Or with uv (faster)
uv pip install atlas-chat
```
CLI Usage
After installation, three CLI tools are available:
```shell
# Set up configuration (run this first!)
atlas-init            # Creates .env and config/ in current directory
atlas-init --minimal  # Creates just a minimal .env file

# Chat with an LLM
atlas-chat "Hello, how are you?" --model gpt-4o
atlas-chat "Use the search tool" --tools server_tool1
atlas-chat --list-tools

# Start the server
atlas-server --port 8000
atlas-server --env /path/to/.env --config-folder /path/to/config
```
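
For orientation, a `.env` file for an LLM app of this kind typically holds provider credentials and server settings. The sketch below is hypothetical; `atlas-init` defines the actual variable names, which are not guaranteed to match these:

```
# Hypothetical .env sketch -- variable names are illustrative guesses,
# not the keys atlas-init actually writes. Run atlas-init to see the real ones.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
ATLAS_PORT=8000
```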
Python API Usage
```python
import asyncio

from atlas import AtlasClient, ChatResult


async def main() -> None:
    client = AtlasClient()

    # Async usage (await requires an async context)
    result: ChatResult = await client.chat("Hello, how are you?")
    print(result.message)

    # With options
    result = await client.chat(
        "Analyze this data",
        model="gpt-4o",
        selected_tools=["calculator", "search"],
        agent_mode=True,
    )

asyncio.run(main())

# Sync wrapper (no event loop needed)
client = AtlasClient()
result = client.chat_sync("Hello!")
```
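
The `chat_sync` wrapper presumably drives the async `chat` method to completion on its own event loop. A self-contained sketch of that pattern, using an illustrative `StubClient` rather than the real `AtlasClient`:

```python
import asyncio


class StubClient:
    """Illustrative stand-in for an async chat client; not AtlasClient."""

    async def chat(self, prompt: str) -> str:
        await asyncio.sleep(0)  # stands in for real async I/O
        return f"echo: {prompt}"

    def chat_sync(self, prompt: str) -> str:
        # Run the coroutine to completion on a fresh event loop.
        return asyncio.run(self.chat(prompt))


result = StubClient().chat_sync("Hello!")
print(result)  # prints "echo: Hello!"
```

`asyncio.run` raises if called from inside a running event loop, so a wrapper like this is only safe from synchronous code, which is exactly the use case above.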
Quick Start (Development)
Prerequisites
```shell
# Install uv package manager (one-time)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create virtual environment and install dependencies
uv venv && source .venv/bin/activate
uv pip install -r requirements.txt
```
Development Installation (Editable Mode)
For development, install the package in editable mode. This creates a link from your Python environment to your local source code, so any changes you make to the code are immediately available without reinstalling.
```shell
# Install in editable mode with uv (recommended)
uv pip install -e .

# Or with pip
pip install -e .
```
What editable mode gives you:
- Edit any Python file in `atlas/` and changes take effect immediately
- CLI commands (`atlas-chat`, `atlas-server`) use your local code
- `from atlas import AtlasClient` in scripts imports your local version
- No need to reinstall after making changes
Example workflow:
```shell
# Install once in editable mode
uv pip install -e .

# Edit code
vim atlas/atlas_client.py

# Run immediately with your changes - no reinstall needed
atlas-chat "test my changes"
python my_script.py  # uses updated AtlasClient
```
Alternative: PYTHONPATH (if you can't use editable install)
```shell
# Set PYTHONPATH manually when running
PYTHONPATH=/path/to/atlas-ui-3 python atlas/main.py
```
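
Setting `PYTHONPATH` this way is equivalent to prepending the checkout directory to `sys.path` before any `atlas` import resolves. A minimal sketch of that equivalence (the path below is a placeholder, not a real location):

```python
import sys

# PYTHONPATH=/path/to/atlas-ui-3 has the same effect as prepending that
# directory to sys.path at interpreter startup. Placeholder path only.
repo_root = "/path/to/atlas-ui-3"
sys.path.insert(0, repo_root)

# From here on, `import atlas` would resolve against the checkout
# rather than any installed copy.
```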
Running the Application
Linux/macOS:
```shell
bash agent_start.sh
```
Windows:
```powershell
.\ps_agent_start.ps1
```
Note for Windows users: If you encounter frontend build errors related to Rollup dependencies, delete frontend/package-lock.json and frontend/node_modules, then run the script again.
Both scripts automatically detect and work with Docker or Podman. The agent_start.sh script builds the frontend, starts necessary services, and launches the backend server.
Documentation
We have created a set of comprehensive guides to help you get the most out of Atlas UI 3.
- Getting Started: The perfect starting point for all users. This guide covers how to get the application running with Docker or on your local machine.
- Administrator's Guide: For those who will deploy and manage the application. This guide details configuration, security settings, access control, and other operational topics.
- Developer's Guide: For developers who want to contribute to the project. It provides an overview of the architecture and instructions for creating new MCP servers.
Container Images
Pre-built container images are available at quay.io/agarlan-snl/atlas-ui-3:latest (pushes automatically from main branch).
For AI Agent Contributors
If you are an AI agent working on this repository, please refer to the following documents for the most current and concise guidance:
- CLAUDE.md: Detailed architecture, workflows, and conventions.
- GEMINI.md: Gemini-specific instructions.
- .github/copilot-instructions.md: A compact guide for getting productive quickly.
License
Copyright 2025 National Technology & Engineering Solutions of Sandia, LLC (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the U.S. Government retains certain rights in this software.
MIT License
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file atlas_chat-0.1.0.tar.gz.
File metadata
- Download URL: atlas_chat-0.1.0.tar.gz
- Upload date:
- Size: 519.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b2ac9932d686441869b74ad28a73b2cd7f08b48c35227aa5f59ef9fb35729986` |
| MD5 | `136ce7b9711ee2f2cd4c949fd177beca` |
| BLAKE2b-256 | `c30a4e13c5ac9850294d502647a292ff239b9a04534e51bbfe06c5784a930aed` |
File details
Details for the file atlas_chat-0.1.0-py3-none-any.whl.
File metadata
- Download URL: atlas_chat-0.1.0-py3-none-any.whl
- Upload date:
- Size: 611.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `145c9735048be77ceb4917966be19d59f922544d4376f3e317ad2d9322278b3b` |
| MD5 | `c60a04f703ca0132f3573dcbd8b1c169` |
| BLAKE2b-256 | `0e1a97de8e8f2286b300bf6edbc2216e1817d336fc3c551adfc3d4099b930b29` |