An empathetic conversational AI companion for Octopus Sensing
Octopus Sensing SARA
An empathetic conversational AI companion with persistent memory, user profiling, and multi-LLM support. Part of the Octopus Sensing project ecosystem.
Built with FastAPI, LangChain, and React.
Features
- Empathetic AI - Emotionally intelligent responses with advanced prompt engineering
- Persistent Memory - SQLite storage for conversations and user profiles
- User Profiling - Automatic extraction of preferences and personal information
- Multi-LLM Support - OpenAI, Anthropic (Claude), Google Gemini, and Ollama
- Auto-Summarization - Long-term conversation summaries for context retention
- Production-Ready - Docker deployment, type safety, async API
- Modern UI - React frontend with shadcn/ui components
Quick Start
Prerequisites
- Docker and Docker Compose
- API key for your LLM provider (OpenAI, Anthropic, or Google); Ollama runs locally and needs no key
Installation
1. Clone the repository

   git clone https://github.com/octopus-sensing/octopus-sensing-sara.git
   cd octopus-sensing-sara

2. Configure environment

   cp .env.example .env

3. Add your API key to .env

   # Choose your provider
   LLM_PROVIDER=gemini
   LLM_MODEL=models/gemini-2.5-flash
   GOOGLE_API_KEY=your-api-key-here

4. Start the application

   docker compose up --build

5. Access the application

   - Frontend: http://localhost
   - Backend API: http://localhost:8000
   - API Docs: http://localhost:8000/docs
Configuration
Configure SARA by editing the .env file:
| Variable | Description | Example |
|---|---|---|
| LLM_PROVIDER | AI provider | openai, anthropic, gemini, ollama |
| LLM_MODEL | Model name | models/gemini-2.5-flash, gpt-4, claude-3-opus |
| LLM_TEMPERATURE | Response creativity (0-2) | 0.7 |
| LLM_MAX_TOKENS | Max response length | 2000 |
| SHORT_TERM_MEMORY_WINDOW | Recent messages to remember | 10 |
| LONG_TERM_MEMORY_SUMMARY_THRESHOLD | Messages before summarization | 50 |
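As a rough sketch, these variables could be read into a plain dictionary as below; the project's actual config in server/octopus_sensing_sara/core/config.py likely uses a settings class, so treat the function name and defaults here as illustrative only (defaults taken from the table above).

```python
import os

def load_llm_config() -> dict:
    """Read LLM settings from the environment, falling back to the
    documented defaults. Illustrative sketch, not the project's config.py."""
    return {
        "provider": os.getenv("LLM_PROVIDER", "gemini"),
        "model": os.getenv("LLM_MODEL", "models/gemini-2.5-flash"),
        "temperature": float(os.getenv("LLM_TEMPERATURE", "0.7")),
        "max_tokens": int(os.getenv("LLM_MAX_TOKENS", "2000")),
        "short_term_window": int(os.getenv("SHORT_TERM_MEMORY_WINDOW", "10")),
    }

cfg = load_llm_config()
```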
LLM Provider Setup
Google Gemini (Free Tier)
LLM_PROVIDER=gemini
LLM_MODEL=models/gemini-2.5-flash
GOOGLE_API_KEY=your-google-api-key
OpenAI
LLM_PROVIDER=openai
LLM_MODEL=gpt-4
OPENAI_API_KEY=sk-your-openai-key
Anthropic Claude
LLM_PROVIDER=anthropic
LLM_MODEL=claude-3-5-sonnet-20241022
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
API Reference
Send Message
curl -X POST http://localhost:8000/chat \
-H "Content-Type: application/json" \
-d '{
"user_id": "user123",
"message": "Hello, I need help with something."
}'
Get User Profile
curl http://localhost:8000/user/user123/profile
Get Conversation History
curl http://localhost:8000/conversation/{session_id}
Full API documentation: http://localhost:8000/docs
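The same endpoints can be called from Python. The sketch below mirrors the curl examples above using only the standard library; the request and response field names are assumptions based on those examples and may differ from the actual API schema.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local deployment from Quick Start

def build_chat_payload(user_id: str, message: str) -> dict:
    """Build the JSON body expected by POST /chat (shape taken from the
    curl example above)."""
    return {"user_id": user_id, "message": message}

def send_message(user_id: str, message: str) -> dict:
    """POST a chat message and return the decoded JSON response."""
    body = json.dumps(build_chat_payload(user_id, message)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```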
Architecture
┌─────────────────┐
│ React Frontend  │
└────────┬────────┘
         │
┌────────▼────────┐
│  FastAPI REST   │
└────────┬────────┘
         │
┌────────▼──────────────────────────┐
│ Conversation Service              │
│ • Process messages                │
│ • Extract user info               │
│ • Manage memory & summarization   │
└────────┬──────────────────────────┘
         │
┌────────▼────────┬──────────┬───────────┐
│ Memory Service  │ User     │ LLM       │
│                 │ Service  │ Factory   │
└────────┬────────┴────┬─────┴────┬──────┘
         │             │          │
┌────────▼─────────────▼──────────▼──────┐
│ SQLite Database (Persistent)           │
│ • Conversations • Users • Messages     │
└────────────────────────────────────────┘
Project Structure
octopus-sensing-sara/
├── server/
│   ├── octopus_sensing_sara/
│   │   ├── api/                 # REST endpoints
│   │   ├── core/                # Config, LLM factory
│   │   ├── prompts/             # Modular prompt system
│   │   │   ├── system_prompt/
│   │   │   ├── summarization_prompt/
│   │   │   └── extraction_prompt/
│   │   ├── models/              # Database & schemas
│   │   ├── services/            # Business logic
│   │   ├── storage/             # Database layer
│   │   ├── main.py              # FastAPI app
│   │   └── run.py               # Entry point
│   ├── tests/
│   └── Dockerfile
│
├── octopus_sensing_sara_ui/
│   ├── src/
│   │   ├── components/          # UI components
│   │   └── pages/               # Page components
│   └── Dockerfile
│
├── docker-compose.yml
├── pyproject.toml               # Poetry configuration & dependencies
└── poetry.lock
How It Works
1. Empathetic Conversations
SARA uses advanced prompt engineering to:
- Recognize emotional states from user messages
- Validate feelings before offering solutions
- Adapt tone based on user's emotional context
- Maintain warm, supportive communication
2. Memory System
Short-Term Memory
- Last 10 messages kept in memory
- Provides immediate conversation context
- Cached per session
Long-Term Memory
- All conversations stored in SQLite
- User profiles with extracted facts
- Automatic summarization after 50 messages
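The two-tier memory described above can be sketched as a sliding window plus a summarization counter. The class name below is hypothetical; the window size (10) and threshold (50) come from the Configuration defaults, and the real memory service will differ in detail.

```python
from collections import deque

SHORT_TERM_WINDOW = 10    # default SHORT_TERM_MEMORY_WINDOW
SUMMARY_THRESHOLD = 50    # default LONG_TERM_MEMORY_SUMMARY_THRESHOLD

class MemorySketch:
    """Illustrative sketch: recent messages live in a bounded deque,
    while a running count decides when to trigger summarization."""

    def __init__(self) -> None:
        self.recent = deque(maxlen=SHORT_TERM_WINDOW)  # short-term context
        self.total_messages = 0

    def add(self, message: str) -> bool:
        """Store a message; return True when a summary should be produced."""
        self.recent.append(message)
        self.total_messages += 1
        return self.total_messages % SUMMARY_THRESHOLD == 0

mem = MemorySketch()
flags = [mem.add(f"msg {i}") for i in range(1, 101)]  # 100 messages
```

After 100 messages, only the last 10 remain in the short-term window, and summarization has triggered twice (at messages 50 and 100).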
3. User Profiling
SARA automatically extracts and remembers:
- Name and personal details
- Preferences and interests
- Conversation history
- Emotional patterns
This data personalizes future interactions.
4. Modular Prompts
Prompts are organized by function:
- system_prompt/ - SARA's core personality
- summarization_prompt/ - Conversation summaries
- extraction_prompt/ - User information extraction
Each uses LangChain templates for consistency.
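The composition idea can be shown with a plain-Python stand-in; the project itself uses LangChain prompt templates, but filling named placeholders works the same way. The template strings below are illustrative, not the project's actual prompts.

```python
# Stand-in templates mirroring the prompt directories above (illustrative).
SYSTEM_PROMPT = (
    "You are SARA, an empathetic AI companion. "
    "Known facts about the user: {facts}"
)
EXTRACTION_PROMPT = "Extract the user's name and preferences from: {message}"

def render(template: str, **values: str) -> str:
    """Fill a prompt template; LangChain's PromptTemplate.format plays the
    same role in the real code."""
    return template.format(**values)

system = render(SYSTEM_PROMPT, facts="enjoys hiking")
```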
Docker Commands
Start application
docker compose up -d
View logs
docker compose logs -f backend
Restart after changes
docker compose restart backend
Stop application
docker compose down
Clean database (fresh start)
./clean.sh
Development
Local Setup (without Docker)
Backend
# Using Poetry (recommended)
poetry install
poetry run python -m octopus_sensing_sara.run
# OR using venv + pip
cd server
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
pip install -e .. # Install from pyproject.toml
python -m octopus_sensing_sara.run
Frontend
cd octopus_sensing_sara_ui
npm install
npm run dev
Adding a New LLM Provider
1. Add an API key field to server/octopus_sensing_sara/core/config.py
2. Add provider logic to server/octopus_sensing_sara/core/llm_factory.py
3. Update the docker-compose.yml environment variables
4. Add the new key to your .env file
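The provider-dispatch step can be sketched as a lookup table keyed by LLM_PROVIDER. The real factory constructs LangChain chat-model classes; here each provider maps to a stub constructor so only the dispatch logic is shown, and the "mistral" entry is a hypothetical new provider, not one the project ships.

```python
from typing import Callable

# Stub constructors standing in for LangChain chat-model classes.
def _openai(model: str): return ("openai", model)
def _anthropic(model: str): return ("anthropic", model)
def _gemini(model: str): return ("gemini", model)
def _mistral(model: str): return ("mistral", model)  # hypothetical addition

PROVIDERS: dict[str, Callable] = {
    "openai": _openai,
    "anthropic": _anthropic,
    "gemini": _gemini,
    "mistral": _mistral,
}

def create_llm(provider: str, model: str):
    """Dispatch on the configured provider; unknown names fail loudly."""
    try:
        return PROVIDERS[provider](model)
    except KeyError:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider}") from None

llm = create_llm("mistral", "mistral-large")
```

Adding a provider then means registering one new entry in the table rather than editing a chain of if/elif branches.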
Tech Stack
FastAPI โข LangChain โข React โข SQLite โข Docker โข Poetry
File details
Details for the file octopus_sensing_sara-1.0.tar.gz.
File metadata
- Download URL: octopus_sensing_sara-1.0.tar.gz
- Upload date:
- Size: 40.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/6.8.0-48-generic
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3b231cf3d51846b4aa8605e17a0d744b782ee22921cbfa21f5392938984075c9 |
| MD5 | ea9835a20e9c754382578ce273ebcb32 |
| BLAKE2b-256 | b749947a1da5ca7c0ffeea31a9cd6e3319265c128a31fe5fdf8683123a9d87b7 |
File details
Details for the file octopus_sensing_sara-1.0-py3-none-any.whl.
File metadata
- Download URL: octopus_sensing_sara-1.0-py3-none-any.whl
- Upload date:
- Size: 53.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.3 Linux/6.8.0-48-generic
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 6e6867858cf57c1fc19e88956052fc36931222c8f6f6b89ae09a709061b42cae |
| MD5 | 667f516be930f5b0ff5a4eb22156869a |
| BLAKE2b-256 | c6f25d945aea84fdbb9d7c20e623c5d8361ea355277f5621295dbe2231810080 |