Sunholo Python Library
🚀 AI DevOps framework for building GenAI applications on Google Cloud Platform
Sunholo is a comprehensive Python framework that streamlines the development, deployment, and management of Generative AI applications (VACs - Virtual Agent Computers). It provides a configuration-driven approach with deep integration into Google Cloud services while supporting multiple AI providers.
🎯 What is Sunholo?
Sunholo helps you:
- 🤖 Build conversational AI agents with any LLM provider (Vertex AI, OpenAI, Anthropic, Ollama)
- ☁️ Deploy to Google Cloud Run with automatic scaling
- 🗄️ Use AlloyDB and Discovery Engine for vector storage and search
- 🔄 Handle streaming responses and async processing
- 📄 Process documents with chunking and embedding pipelines
- 🔧 Manage complex configurations with YAML files
- 🎨 Create APIs, web apps, and chat bots
🚀 Quick Start
Prerequisites
Install uv - a fast, modern Python package manager:
```bash
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```
Installation
```bash
# Install with CLI tools (recommended)
uv tool install --from "sunholo[cli]" sunholo

# Install with all features including GCP
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[all]"
```
Your First VAC
1. Initialize a new project:

```bash
sunholo init my-ai-agent
cd my-ai-agent
```

2. Configure your AI agent by editing `config/vac_config.yaml`:

```yaml
kind: vacConfig
apiVersion: v1
vac:
  my-agent:
    llm: vertex
    model: gemini-1.5-pro
    agent: simple
    description: "My AI agent powered by Google Cloud"
```

3. Chat with your agent locally:

```bash
sunholo vac chat my-agent
```

4. Run your agent as a local Flask app:

```bash
sunholo deploy my-agent
```
📋 Features
Core Capabilities
- Multi-Model Support: Integrate Vertex AI, OpenAI, Anthropic, Ollama in one app
- Document Processing: Chunk, embed, and index documents with Discovery Engine
- Vector Databases: Native support for AlloyDB, LanceDB, Supabase
- Streaming: Real-time response streaming for chat applications
- Async Processing: Pub/Sub integration for background tasks
- Authentication: Built-in Google Cloud IAM and custom auth
Google Cloud Integration
- Vertex AI: Access Gemini, PaLM, and custom models
- AlloyDB: PostgreSQL-compatible vector database
- Discovery Engine: Enterprise search and RAG
- Cloud Run: Serverless deployment
- Cloud Storage: Document and file management
- Pub/Sub: Asynchronous message processing
- Cloud Logging: Centralized logging
Framework Support
- Web Frameworks: Flask and FastAPI templates
- AI Frameworks: LangChain and LlamaIndex integration
- MCP Integration: Model Context Protocol server and client support
- Observability: Langfuse for tracing and monitoring
- API Standards: OpenAI-compatible endpoints
🛠 Installation Options
Using uv
```bash
# Core CLI features
uv tool install --from "sunholo[cli]" sunholo

# With Google Cloud Platform integration
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[gcp]"

# With specific LLM providers
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[openai]"
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[anthropic]"

# With database support
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[database]"

# Everything
uv tool install --from "sunholo[cli]" sunholo --with "sunholo[all]"
```
Managing Installations
```bash
# Upgrade
uv tool upgrade sunholo

# List installed
uv tool list

# Uninstall
uv tool uninstall sunholo
```
Development Setup
```bash
# Clone repository
git clone https://github.com/sunholo-data/sunholo-py.git
cd sunholo-py

# Install in development mode
uv venv
uv pip install -e ".[all]"

# Run tests
pytest tests/
```
⚙️ Configuration
Sunholo uses YAML configuration files:
```yaml
# config/vac_config.yaml
kind: vacConfig
apiVersion: v1
gcp_config:
  project_id: my-gcp-project
  location: us-central1
vac:
  my-agent:
    llm: vertex
    model: gemini-1.5-pro
    agent: langchain
    memory:
      - alloydb:
          project_id: my-gcp-project
          region: us-central1
          cluster: my-cluster
          instance: my-instance
    tools:
      - search
      - calculator
```
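Once parsed, the YAML above becomes nested dictionaries keyed by VAC name. As a rough illustration of the lookup that `ConfigManager` performs for you, here is a stdlib-only sketch (the dict literal mirrors the config above; the helper name `vac_setting` is hypothetical, not part of the sunholo API):

```python
# Hypothetical sketch: how a parsed vacConfig maps to nested dicts.
# The real library's ConfigManager handles loading and lookups.
config = {
    "kind": "vacConfig",
    "apiVersion": "v1",
    "gcp_config": {"project_id": "my-gcp-project", "location": "us-central1"},
    "vac": {
        "my-agent": {
            "llm": "vertex",
            "model": "gemini-1.5-pro",
            "agent": "langchain",
            "tools": ["search", "calculator"],
        }
    },
}

def vac_setting(cfg: dict, vac_name: str, key: str, default=None):
    """Return one setting for a named VAC, or a default if absent."""
    return cfg.get("vac", {}).get(vac_name, {}).get(key, default)

print(vac_setting(config, "my-agent", "model"))        # gemini-1.5-pro
print(vac_setting(config, "my-agent", "missing", "n/a"))  # n/a
```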
🔧 CLI Commands
```bash
# Project Management
sunholo init <project-name>            # Create new project from template
sunholo list-configs                   # List all configurations
sunholo list-configs --validate        # Validate configurations

# Development
sunholo vac chat <vac-name>            # Chat with a VAC locally
sunholo vac list                       # List available VACs
sunholo vac get-url <vac-name>         # Get Cloud Run URL for a VAC
sunholo proxy start <service>          # Start local proxy to cloud service
sunholo proxy list                     # List running proxies
sunholo deploy <vac-name>              # Run Flask app locally

# Document Processing
sunholo embed <vac-name>               # Process and embed documents
sunholo merge-text <folder> <output>   # Merge files for context

# Cloud Services
sunholo discovery-engine create <name> # Create Discovery Engine instance
sunholo vertex list-extensions         # List Vertex AI extensions
sunholo swagger <vac-name>             # Generate OpenAPI spec

# Integration Tools
sunholo excel-init                     # Initialize Excel plugin
sunholo llamaindex <query>             # Query with LlamaIndex
sunholo mcp list-tools                 # List MCP tools
sunholo tts <text>                     # Text-to-speech synthesis
```
📝 Examples
Chat with History Extraction
```python
from sunholo.utils import ConfigManager
from sunholo.components import pick_llm
from sunholo.agents import extract_chat_history

config = ConfigManager('my-agent')
llm = pick_llm(config=config)

# Extract chat history from messages
chat_history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there!"}
]
history_str = extract_chat_history(chat_history)

# Use in prompt
response = llm.invoke(f"Given this history:\n{history_str}\n\nUser: How are you?")
```
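The exact string format `extract_chat_history` returns isn't shown here; a stdlib-only sketch of the role-prefixed flattening it plausibly performs (the real function may use a different layout or message schema):

```python
def flatten_history(messages: list[dict]) -> str:
    """Join role/content message dicts into a prompt-friendly transcript.

    Illustrative only: sunholo's extract_chat_history may format
    history differently.
    """
    lines = []
    for msg in messages:
        role = msg.get("role", "user").capitalize()
        lines.append(f"{role}: {msg.get('content', '')}")
    return "\n".join(lines)

history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there!"},
]
print(flatten_history(history))
# User: Hello
# Assistant: Hi there!
```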
Document Processing with Chunker
```python
from sunholo.chunker import direct_file_to_embed
from sunholo.utils import ConfigManager

config = ConfigManager('my-agent')

# Process a file directly
result = direct_file_to_embed(
    "document.pdf",
    embed_prefix="doc",
    metadata={"source": "user_upload"},
    vectorstore=config.vacConfig("vectorstore")
)
```
Vertex AI with Memory Tools
```python
from sunholo.vertex import get_vertex_memories
from sunholo.utils import ConfigManager

config = ConfigManager('my-agent')

# Get Vertex AI memory configuration
memory_config = get_vertex_memories(config)

# Use with Vertex AI
if memory_config:
    print(f"Memory tools configured: {memory_config}")
```
Streaming Response with Flask
```python
from flask import Flask, Response, request
from sunholo.agents import send_to_qa

app = Flask(__name__)

@app.route('/vac/streaming/<vac_name>', methods=['POST'])
def streaming_endpoint(vac_name):
    question = request.json.get('user_input')

    def generate():
        # Stream responses from the QA system
        response = send_to_qa(
            question,
            vac_name=vac_name,
            stream=True
        )
        if hasattr(response, '__iter__'):
            for chunk in response:
                yield f"data: {chunk}\n\n"
        else:
            yield f"data: {response}\n\n"

    return Response(generate(), content_type='text/event-stream')
```
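Each yielded chunk above uses Server-Sent Events framing: a `data:` field terminated by a blank line, which browser `EventSource` clients parse as one event. A tiny helper isolating that format (the helper name `sse_event` is illustrative, not a sunholo API):

```python
def sse_event(chunk: str) -> str:
    """Wrap a text chunk in SSE framing: 'data: <chunk>' plus the
    blank line that terminates the event."""
    return f"data: {chunk}\n\n"

# A streamed reply becomes a sequence of SSE events on the wire:
events = [sse_event(c) for c in ["Hel", "lo!"]]
print("".join(events))
```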
Discovery Engine Integration
```python
from sunholo.discovery_engine import DiscoveryEngineClient

# Initialize client
client = DiscoveryEngineClient(
    project_id='my-project',
    data_store_id='my-datastore'
)

# Search documents
results = client.search("What is Vertex AI?")
for result in results:
    print(f"Content: {result.chunk.content}")
    print(f"Score: {result.relevance_score}")
```
🧪 Testing
```bash
# Run all tests
pytest tests/

# Run specific test file
pytest tests/test_config.py

# Run with coverage
pytest --cov=src/sunholo tests/

# Run async tests
pytest tests/test_async_genai2.py
```
📚 Documentation
- 📖 Full Documentation: https://dev.sunholo.com/
- 🎓 Tutorials: https://dev.sunholo.com/docs/howto/
- 🤖 VAC Examples: https://github.com/sunholo-data/vacs-public
- 🎧 Audio Overview: Listen to the NotebookLM podcast
🤝 Contributing
We welcome contributions! See our Contributing Guidelines.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
📜 License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Copyright 2024 Holosun ApS
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
🙏 Support
- 📧 Email: multivac@sunholo.com
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 📖 Documentation: https://dev.sunholo.com/