VSSLite
A vector similarity search engine for humans🥳
🎁 Install
$ pip install vsslite
✨ Features
VSSLite provides a user-friendly interface for langchain and sqlite-vss.
🧩 Start API server
$ export OPENAI_API_KEY="YOUR_API_KEY"
$ python -m vsslite
Or run it programmatically:
import uvicorn
from vsslite import LangChainVSSLiteServer
app = LangChainVSSLiteServer(YOUR_API_KEY).app
uvicorn.run(app, host="127.0.0.1", port=8000)
Go to http://127.0.0.1:8000/docs to see the API details and try it out.
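If you prefer launching with the uvicorn command line (as the LINE Bot example below does), drop the uvicorn.run call and expose only the app. A minimal sketch, assuming you save it as server.py:

import os

from vsslite import LangChainVSSLiteServer

# Build only the ASGI app; the API key is read from the environment here
app = LangChainVSSLiteServer(os.getenv("OPENAI_API_KEY")).app

Then start it:

$ uvicorn server:app --host 127.0.0.1 --port 8000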
🔍 Search
from vsslite import LangChainVSSLiteClient
# Initialize
vss = LangChainVSSLiteClient()
# Add data with embeddings
vss.add("The difference between eel and conger eel is that eel is more expensive.")
vss.add("Red pandas are smaller than pandas, but when it comes to cuteness, there is no \"lesser\" about them.")
vss.add("There is no difference between \"Ohagi\" and \"Botamochi\" themselves; they are used interchangeably depending on the season.")
# Search
print(vss.search("fish", count=1))
print(vss.search("animal", count=1))
print(vss.search("food", count=1))
Save the code above as run.py and run it to get search results like these:
$ python run.py
[{'page_content': 'The difference between eel and conger eel is that eel is more expensive.', 'metadata': {'source': 'inline'}}]
[{'page_content': 'Red pandas are smaller than pandas, but when it comes to cuteness, there is no "lesser" about them.', 'metadata': {'source': 'inline'}}]
[{'page_content': 'There is no difference between "Ohagi" and "Botamochi" themselves; they are used interchangeably depending on the season.', 'metadata': {'source': 'inline'}}]
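As the output shows, each hit is a dict with page_content and metadata keys, so you can pull out just the matched text:

# Print the text of the best match for "fish"
results = vss.search("fish", count=1)
if results:
    print(results[0]["page_content"])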
🔧 Data management (Add, Get, Update, Delete)
VSSLite also provides helpers for CRUD operations.
# Add
id = vss.add("The difference between eel and conger eel is that eel is more expensive.")[0]
# Get
vss.get(id)
# Update
vss.update(id, "The difference between eel and conger eel is that eel is more expensive. Una-jiro is cheaper than both of them.")
# Delete
vss.delete(id)
# Delete all
vss.delete_all()
You can also upload data from files. Text, PDF, CSV, and JSON are accepted for now.
vss.upload("path/to/data.json")
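For example, you can upload a document and query it right away. A minimal sketch; the file path below is just a placeholder:

# Upload a PDF and search over its contents (path is illustrative)
vss.upload("path/to/terms.pdf")
print(vss.search("refund policy", count=3))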
🍻 Asynchronous
Use the async methods when embedding VSSLite in server apps.
await vss.aadd("~~~")
await vss.aupdate(id, "~~~")
await vss.aget(id)
await vss.adelete(id)
await vss.adelete_all()
await vss.asearch("~~~")
await vss.aupload("~~~")
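For instance, you can call the async client from your own FastAPI endpoint. This is a minimal sketch and not part of VSSLite itself; the /search route and the q parameter are made up for illustration:

from fastapi import FastAPI
from vsslite import LangChainVSSLiteClient

app = FastAPI()
vss = LangChainVSSLiteClient()

@app.get("/search")
async def search(q: str):
    # Non-blocking vector search inside the request handler
    return await vss.asearch(q)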
🧇 Namespace
VSSLite supports namespaces for dividing the set of documents to search or update.
vss = LangChainVSSLiteClient()
# Search product documents
vss.search("What is the difference between super size and ultra size?", namespace="product")
# Search company documents
vss.search("Who is the CTO of Unagiken?", namespace="company")
🌐 Web UI
You can quickly launch a Q&A web service based on documents 🚅
Install dependencies
$ pip install streamlit
$ pip install streamlit-chat
Make a script
This is an example for the OpenAI terms of use (upload the terms of use to the VSS server with the namespace openai). Save this script as runui.py.
import asyncio
from vsslite.chat import (
    ChatUI,
    VSSQAFunction
)

# Setup QA function
openai_qa_func = VSSQAFunction(
    name="get_openai_terms_of_use",
    description="Get information about terms of use of OpenAI services including ChatGPT.",
    parameters={"type": "object", "properties": {}},
    namespace="openai",
    # answer_lang="Japanese",  # <- Uncomment if you want to get the answer in Japanese
    # is_always_on=True,  # <- Uncomment if you want to always fire this function
    verbose=True
)

# Start app
chatui = ChatUI(temperature=0.5, functions=[openai_qa_func])
asyncio.run(chatui.start())
Start UI
$ streamlit run runui.py
See https://docs.streamlit.io to learn more about Streamlit.
💬 LINE Bot
You can quickly launch a LINE Bot based on documents 🛫
Install dependencies
$ pip install aiohttp line-bot-sdk
Make a script
This is an example for the OpenAI terms of use (upload the terms of use to the VSS server with the namespace openai). Save this script as line.py.
import os

from vsslite.chatgpt_processor import VSSQAFunction
from vsslite.line import LineBotServer

# Setup QA function(s)
openai_qa_func = VSSQAFunction(
    name="get_openai_terms_of_use",
    description="Get information about terms of use of OpenAI services including ChatGPT.",
    parameters={"type": "object", "properties": {}},
    vss_url=os.getenv("VSS_URL") or "http://127.0.0.1:8000",
    namespace="openai",
    # answer_lang="Japanese",  # <- Uncomment if you want to get the answer in Japanese
    # is_always_on=True,  # <- Uncomment if you want to always fire this function
    verbose=True
)

app = LineBotServer(
    channel_access_token=YOUR_CHANNEL_ACCESS_TOKEN,
    channel_secret=YOUR_CHANNEL_SECRET,
    endpoint_path="/linebot",  # <- Set "https://your_domain/linebot" as the webhook URL at LINE Developers
    functions=[openai_qa_func]
).app
Start LINE Bot Webhook Server
$ uvicorn line:app --host 0.0.0.0 --port 8002
Set https://your_domain/linebot as the webhook URL at LINE Developers.
🐳 Docker
If you want to start the VSSLite API together with the chat console, use the docker-compose.yml in examples.
Set your OpenAI API Key in vsslite.env and execute the command below:
$ docker-compose -p vsslite --env-file vsslite.env up -d --build
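For reference, a minimal vsslite.env needs only the API key; the variable name is assumed to match the Dockerfile commands below:

OPENAI_API_KEY=YOUR_API_KEY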
Or use the Dockerfiles to start each service separately.
$ docker build -t vsslite-api -f Dockerfile.api .
$ docker run --name vsslite-api --mount type=bind,source="$(pwd)"/vectorstore,target=/app/vectorstore -d -p 8000:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY vsslite-api:latest
$ docker build -t vsslite-chat -f Dockerfile.chat .
$ docker run --name vsslite-chat -d -p 8001:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY vsslite-chat:latest
🌊 Using Azure OpenAI Service
VSSLite supports Azure OpenAI Service👍
API Server
Use OpenAIEmbeddings configured for Azure.
import os

from langchain.embeddings import OpenAIEmbeddings
from vsslite import LangChainVSSLiteServer

azure_embeddings = OpenAIEmbeddings(
    openai_api_type="azure",
    openai_api_base="https://your-endpoint.openai.azure.com/",
    openai_api_version="2023-08-01-preview",
    deployment="your-embeddings-deployment-name"
)

app = LangChainVSSLiteServer(
    apikey=YOUR_API_KEY or os.getenv("OPENAI_API_KEY"),
    persist_directory="./vectorstore",
    chunk_size=500,
    chunk_overlap=0,
    embedding_function=azure_embeddings
).app
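Then serve it with uvicorn as in the API server section; this assumes the snippet above is saved as azure_server.py:

$ uvicorn azure_server:app --host 127.0.0.1 --port 8000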
Chat UI
Create a ChatUI with Azure OpenAI Service configuration.
chatui = ChatUI(
    apikey=YOUR_API_KEY or os.getenv("OPENAI_API_KEY"),
    temperature=0.5,
    functions=[openai_qa_func],
    # Config for Azure OpenAI Service
    api_type="azure",
    api_base="https://your-endpoint.openai.azure.com/",
    api_version="2023-08-01-preview",
    engine="your-chat-deployment-name"
)
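Start it the same way as the standard Chat UI: finish the script with asyncio.run(chatui.start()) and launch it with Streamlit (the file name runui.py is reused here for illustration):

$ streamlit run runui.py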
See also the examples.
🍪 Classic version (based on SQLite)
See v0.3.0 README
🥰 Special thanks