An empathetic conversational AI companion for Octopus Sensing

Octopus Sensing SARA

An empathetic conversational AI companion with persistent memory, user profiling, and multi-LLM support. Part of the Octopus Sensing project ecosystem.

Built with FastAPI, LangChain, and React.


Features

  • 🧠 Empathetic AI - Emotionally intelligent responses with advanced prompt engineering
  • 💾 Persistent Memory - SQLite storage for conversations and user profiles
  • 👤 User Profiling - Automatic extraction of preferences and personal information
  • 🔌 Multi-LLM Support - OpenAI, Anthropic (Claude), Google Gemini, and Ollama
  • 📝 Auto-Summarization - Long-term conversation summaries for context retention
  • 🚀 Production-Ready - Docker deployment, type safety, async API
  • 🎨 Modern UI - React frontend with shadcn/ui components

Quick Start

Prerequisites

  • Docker and Docker Compose
  • API key for your LLM provider (OpenAI, Anthropic, or Google)

Installation

  1. Clone the repository

    git clone https://github.com/octopus-sensing/octopus-sensing-sara.git
    cd octopus-sensing-sara
    
  2. Configure environment

    cp .env.example .env
    
  3. Add your API key to .env

    # Choose your provider
    LLM_PROVIDER=gemini
    LLM_MODEL=models/gemini-2.5-flash
    GOOGLE_API_KEY=your-api-key-here
    
  4. Start the application

    docker compose up --build
    
  5. Access the application

     With the containers running, the backend API is available at http://localhost:8000 (interactive docs at http://localhost:8000/docs), and the frontend is served by its own container (see docker-compose.yml for its port).

Configuration

Configure SARA by editing the .env file:

Variable                              Description                     Example
LLM_PROVIDER                          AI provider                     openai, anthropic, gemini, ollama
LLM_MODEL                             Model name                      models/gemini-2.5-flash, gpt-4, claude-3-opus
LLM_TEMPERATURE                       Response creativity (0-2)       0.7
LLM_MAX_TOKENS                        Max response length             2000
SHORT_TERM_MEMORY_WINDOW              Recent messages to remember     10
LONG_TERM_MEMORY_SUMMARY_THRESHOLD    Messages before summarization   50
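
In .env syntax, the settings above look like this (values are the table's examples):

```
LLM_TEMPERATURE=0.7
LLM_MAX_TOKENS=2000
SHORT_TERM_MEMORY_WINDOW=10
LONG_TERM_MEMORY_SUMMARY_THRESHOLD=50
```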

LLM Provider Setup

Google Gemini (Free Tier)

LLM_PROVIDER=gemini
LLM_MODEL=models/gemini-2.5-flash
GOOGLE_API_KEY=your-google-api-key

OpenAI

LLM_PROVIDER=openai
LLM_MODEL=gpt-4
OPENAI_API_KEY=sk-your-openai-key

Anthropic Claude

LLM_PROVIDER=anthropic
LLM_MODEL=claude-3-5-sonnet-20241022
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key

API Reference

Send Message

curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "user123",
    "message": "Hello, I need help with something."
  }'

Get User Profile

curl http://localhost:8000/user/user123/profile

Get Conversation History

curl http://localhost:8000/conversation/{session_id}

Full API documentation: http://localhost:8000/docs
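
The same POST /chat call can be made from Python with only the standard library; this sketch mirrors the curl example above (the response schema is whatever the server returns):

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # backend address from the Quick Start

def build_chat_request(user_id: str, message: str) -> urllib.request.Request:
    """Build the same POST /chat call as the curl example above."""
    body = json.dumps({"user_id": user_id, "message": message}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the server running:
# with urllib.request.urlopen(build_chat_request("user123", "Hello")) as resp:
#     reply = json.loads(resp.read())
```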


Architecture

┌─────────────────┐
│  React Frontend │
└────────┬────────┘
         │
┌────────▼────────┐
│   FastAPI REST  │
└────────┬────────┘
         │
┌────────▼───────────────────────────┐
│   Conversation Service             │
│   • Process messages               │
│   • Extract user info              │
│   • Manage memory & summarization  │
└────────┬───────────────────────────┘
         │
┌────────▼────────┬──────────┬──────────┐
│ Memory Service  │  User    │   LLM    │
│                 │ Service  │ Factory  │
└────────┬────────┴────┬─────┴────┬─────┘
         │             │          │
┌────────▼─────────────▼──────────▼──────┐
│     SQLite Database (Persistent)       │
│  • Conversations  • Users  • Messages  │
└────────────────────────────────────────┘

Project Structure

octopus-sensing-sara/
├── server/
│   ├── octopus_sensing_sara/
│   │   ├── api/              # REST endpoints
│   │   ├── core/             # Config, LLM factory
│   │   ├── prompts/          # Modular prompt system
│   │   │   ├── system_prompt/
│   │   │   ├── summarization_prompt/
│   │   │   └── extraction_prompt/
│   │   ├── models/           # Database & schemas
│   │   ├── services/         # Business logic
│   │   ├── storage/          # Database layer
│   │   ├── main.py           # FastAPI app
│   │   └── run.py            # Entry point
│   ├── tests/
│   └── Dockerfile
│
├── octopus_sensing_sara_ui/
│   ├── src/
│   │   ├── components/       # UI components
│   │   └── pages/            # Page components
│   └── Dockerfile
│
├── docker-compose.yml
├── pyproject.toml              # Poetry configuration & dependencies
└── poetry.lock

How It Works

1. Empathetic Conversations

SARA uses advanced prompt engineering to:

  • Recognize emotional states from user messages
  • Validate feelings before offering solutions
  • Adapt tone based on user's emotional context
  • Maintain warm, supportive communication

2. Memory System

Short-Term Memory

  • Last 10 messages kept in memory
  • Provides immediate conversation context
  • Cached per session

Long-Term Memory

  • All conversations stored in SQLite
  • User profiles with extracted facts
  • Automatic summarization after 50 messages
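
The two-tier policy above can be sketched in a few lines. This is an illustrative model, not SARA's actual implementation (that lives under services/); the defaults mirror SHORT_TERM_MEMORY_WINDOW and LONG_TERM_MEMORY_SUMMARY_THRESHOLD:

```python
from collections import deque

class ConversationMemory:
    """Illustrative sketch of SARA's two-tier memory policy."""

    def __init__(self, window: int = 10, summary_threshold: int = 50):
        self.recent = deque(maxlen=window)  # short-term: last N messages
        self.summaries = []                 # long-term: rolling summaries
        self.summary_threshold = summary_threshold
        self.total_messages = 0

    def add_message(self, text: str) -> None:
        self.recent.append(text)
        self.total_messages += 1
        if self.total_messages % self.summary_threshold == 0:
            # In SARA this would be an LLM summarization call.
            self.summaries.append(f"summary through message {self.total_messages}")

    def context(self) -> list:
        # What gets handed to the LLM: summaries first, then recent turns.
        return list(self.summaries) + list(self.recent)
```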

3. User Profiling

SARA automatically extracts and remembers:

  • Name and personal details
  • Preferences and interests
  • Conversation history
  • Emotional patterns

This data personalizes future interactions.
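
A stored profile might look roughly like this; the field names are hypothetical (the real schema is defined under server/octopus_sensing_sara/models/):

```python
# Hypothetical example of an extracted user profile.
profile = {
    "user_id": "user123",
    "name": "Alex",                                  # name and personal details
    "preferences": ["prefers short answers"],
    "interests": ["hiking", "sci-fi"],
    "emotional_patterns": ["stressed on weekdays"],
}
```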

4. Modular Prompts

Prompts are organized by function:

  • system_prompt/ - SARA's core personality
  • summarization_prompt/ - Conversation summaries
  • extraction_prompt/ - User information extraction

Each uses LangChain templates for consistency.
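
The project uses LangChain templates; as a rough stdlib sketch of the same placeholder pattern (the prompt wording here is invented, only the mechanism is the point):

```python
from string import Template

# Invented wording; illustrates how a system prompt composes user
# facts and a conversation summary via placeholders.
SYSTEM_PROMPT = Template(
    "You are SARA, an empathetic companion.\n"
    "Known user facts: $user_facts\n"
    "Conversation summary: $summary"
)

rendered = SYSTEM_PROMPT.substitute(
    user_facts="name=Alex; likes hiking",
    summary="User discussed sleep trouble.",
)
```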


Docker Commands

Start application

docker compose up -d

View logs

docker compose logs -f backend

Restart after changes

docker compose restart backend

Stop application

docker compose down

Clean database (fresh start)

./clean.sh

Development

Local Setup (without Docker)

Backend

# Using Poetry (recommended)
poetry install
poetry run python -m octopus_sensing_sara.run

# OR using venv + pip
cd server
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install -e ..  # Install from pyproject.toml
python -m octopus_sensing_sara.run

Frontend

cd octopus_sensing_sara_ui
npm install
npm run dev

Adding a New LLM Provider

  1. Add API key field to server/octopus_sensing_sara/core/config.py
  2. Add provider logic to server/octopus_sensing_sara/core/llm_factory.py
  3. Update docker-compose.yml environment variables
  4. Add to .env file
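
Step 2 amounts to adding one more branch to a provider dispatch. A hedged sketch of that pattern (the real core/llm_factory.py may be structured differently; env-var names come from the .env examples above, client names are the usual LangChain classes):

```python
# Illustrative provider registry; not the project's actual code.
PROVIDERS = {
    "openai":    {"client": "ChatOpenAI",             "key_env": "OPENAI_API_KEY"},
    "anthropic": {"client": "ChatAnthropic",          "key_env": "ANTHROPIC_API_KEY"},
    "gemini":    {"client": "ChatGoogleGenerativeAI", "key_env": "GOOGLE_API_KEY"},
    "ollama":    {"client": "ChatOllama",             "key_env": None},  # local, no key
}

def resolve_provider(name: str) -> dict:
    """Look up LLM_PROVIDER; a new provider is one more entry above."""
    try:
        return PROVIDERS[name]
    except KeyError:
        raise ValueError(f"Unsupported LLM_PROVIDER: {name!r}") from None
```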

Tech Stack

FastAPI โ€ข LangChain โ€ข React โ€ข SQLite โ€ข Docker โ€ข Poetry
