A lightweight UI for smolagents
Project description
A lightweight web UI for 🤗smolagents.
🆕Recent Updates
- v0.1.0 (Dec 11, 2025): First release!
✨Overview
🤗Smolagents is a flexible and powerful framework for building AI agents powered by large language models (LLMs). This repo aims to provide a friendly web app for both developers and end users.
| Features | Support |
|---|---|
| Chat history | :white_check_mark: Persistent storage in a local database |
| Rich outputs | :white_check_mark: Displays images, DataFrames, and other complex objects |
💿Installation
The Python package is available on PyPI.
```shell
pip install smolagentsUI
```
🚀Quick Start
In this demo, we build a Code Agent to analyze the public breast cancer dataset. The agent has access to a Python interpreter. All actions are executed via Python code. The agent can load a dataset from disk, perform exploratory data analysis, build machine learning models, and visualize results. For more information about Code Agents, please refer to the Smolagents documentation.
We first define a custom tool that loads a dataset as a pandas DataFrame. The agent will use this tool to access the breast cancer dataset.
```python
from smolagents import Tool
import pandas as pd


class DataLoaderTool(Tool):
    name = "data_loader"
    description = """Get breast cancer dataset as pandas.DataFrame."""
    inputs = {}
    output_type = "object"

    def __init__(self, df: pd.DataFrame):
        super().__init__()
        self.df = df.copy()

    def forward(self) -> pd.DataFrame:
        return self.df
```
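Note that the tool stores a defensive copy of the DataFrame (`df.copy()` in `__init__`), so whatever the caller does to the original frame afterwards never reaches the agent. A minimal pandas-only sketch of this copy semantics, using a hypothetical `DataHolder` stand-in so it runs without smolagents:

```python
import pandas as pd


# Hypothetical stand-in mirroring DataLoaderTool's copy-on-construction behavior.
class DataHolder:
    def __init__(self, df: pd.DataFrame):
        self.df = df.copy()  # defensive copy: later edits to `df` don't leak in

    def get(self) -> pd.DataFrame:
        return self.df


original = pd.DataFrame({"radius": [17.99, 20.57], "diagnosis": ["M", "M"]})
holder = DataHolder(original)
original.loc[0, "diagnosis"] = "B"  # mutate the caller's frame afterwards

print(holder.get().loc[0, "diagnosis"])  # holder still sees the original value, "M"
```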
We then create an instance of the tool with the locally stored breast cancer dataset.
```python
df = pd.read_csv('./demo/data/breast-cancer.data.csv')
data_loader_tool = DataLoaderTool(df=df)
```
Next, we prepare a model instance using a locally served gpt-oss-120b as the LLM backend.
```python
from smolagents import CodeAgent, OpenAIModel

model = OpenAIModel(
    model_id="openai/gpt-oss-120b",
    api_key="",
    api_base="http://localhost:8000/v1",
)
```
We create a Code Agent with the data loader tool. For data analysis tasks, we authorize additional Python libraries such as pandas, numpy, sklearn, tableone, matplotlib, and PIL. We also append instructions to the agent's system prompt so that all outputs are returned via the `final_answer` function; adjust these for your specific use case.
```python
instructions = """
Specific Instructions:
1. Do not save any files to disk. All outputs must be returned via the `final_answer` function, which is the ONLY way users can see your outputs.
2. Users might not see your intermediate reasoning steps, so explain your thoughts clearly in the `final_answer` function.
3. If your output is an object:
    - Prefer passing a list to `final_answer` containing a friendly, helpful explanatory text plus the requested output (e.g., Markdown text, dict, PIL image, matplotlib figure, pandas DataFrame...). For example: `final_answer(["<Your explanation and thoughts in Markdown>", df.head(), img])`.
    - Always check your output object by printing its type and a content summary before passing it to `final_answer` to avoid errors, e.g., `print(type(your_object))` and `print(your_object)`.
4. Communication is key. If you need clarification or more information from the user, ask clarifying questions via the `final_answer` function before taking action.
5. If the task requires writing long code, do not write it all at once. Break the code into smaller snippets, functions, or classes and implement them one by one, testing each part before moving on to the next. This avoids overwhelming the execution environment and causing memory issues.
"""
```
```python
agent = CodeAgent(
    tools=[data_loader_tool],
    model=model,
    executor_type='local',
    additional_authorized_imports=[
        "pandas", "numpy.*", "tableone", "scipy", "scipy.*",
        "sklearn", "sklearn.*", "statsmodels", "statsmodels.*",
        "matplotlib", "matplotlib.*", "PIL", "PIL.*",
    ],
    instructions=instructions,
    stream_outputs=True,
)
```
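Instruction 3 above asks the agent to bundle an explanatory string together with its objects in a list before calling `final_answer`. The shape of such a payload, and the type check the instructions recommend, can be sketched with plain pandas (`final_answer` itself exists only inside the agent's execution environment, so it is left commented out; the data here is made up for illustration):

```python
import pandas as pd

df = pd.DataFrame({"diagnosis": ["M", "B", "B"], "radius_mean": [17.99, 12.45, 13.54]})

# Build the payload the instructions describe: explanation first, objects after.
summary = df.groupby("diagnosis")["radius_mean"].mean().to_frame("mean_radius")
payload = ["**Mean radius by diagnosis** (computed from the demo frame):", summary]

# The sanity check the instructions recommend before handing off the payload.
for item in payload:
    print(type(item))

# Inside the agent's sandbox, this call would end the step:
# final_answer(payload)
```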
Now, we start the web UI server with persistent chat history storage (a SQLite database file). If the file does not exist yet, it will be created at the specified path. If the `storage_path` parameter is omitted, the chat history is kept in memory only (non-persistent).
```python
import smolagentsUI

# Create or load chat history from the specified SQLite database file
smolagentsUI.serve(agent, host="0.0.0.0", port=5000, storage_path="./chat_history/mychat.db")

# For in-memory chat history (non-persistent), omit the `storage_path` parameter
# smolagentsUI.serve(agent, host="0.0.0.0", port=5000)
```
File details
Details for the file smolagentsui-0.1.2.tar.gz.
File metadata
- Download URL: smolagentsui-0.1.2.tar.gz
- Upload date:
- Size: 23.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.8 Linux/6.8.0-88-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `8a4b97e8e0246cc29c52c8b5bf4d546debe1177a9a4c832153a7da0b084539cd` |
| MD5 | `fdce2722ec92e8c768387cb07ae6a7a5` |
| BLAKE2b-256 | `3e5eafd8f3ae0e79105956bc3eb9cf5ba59e662e6a406a9a0c8c90a7150be607` |
File details
Details for the file smolagentsui-0.1.2-py3-none-any.whl.
File metadata
- Download URL: smolagentsui-0.1.2-py3-none-any.whl
- Upload date:
- Size: 23.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.8 Linux/6.8.0-88-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c6d45488b40f099fc003ec00f41993d13dc4efcc8c3522cd6e0236f05de68ca2` |
| MD5 | `d3b048c941403f219d078734cd9aa159` |
| BLAKE2b-256 | `f88c5b77ac218ff66368c98a99644b1981794d20bbb1a9a9ce90ce19a54e7801` |