cuminai
This package contains the Cumin AI Python SDK. Cumin AI is a Managed LLM Context Service. The package provides an integration with LangChain.
Installation
pip install cuminai
Usage
The CuminAI class provides easy access to the Cumin AI Context store.
# Setup API key
import os
from getpass import getpass
CUMINAI_API_KEY = getpass("Enter Cumin AI API Key: ")
os.environ["CUMINAI_API_KEY"] = CUMINAI_API_KEY
# Access Cumin AI Client
from cuminai import CuminAI
embedding = ... # use a LangChain Embeddings class
client = CuminAI(
    source="<Cumin AI Context Source>",
    embedding_function=embedding
)
# Get LangChain retriever for appending to a chain.
num_docs_to_retrieve = ... # number of docs to retrieve. Defaults to 4
retriever = client.as_retriever(search_kwargs={"k": num_docs_to_retrieve})
# Get LangChain retriever returning documents that carry at least one of the given tags.
num_docs_to_retrieve = ... # number of docs to retrieve. Defaults to 4
has_any_of_these_tags = ["<document tag 1>", "<document tag 2>"] # only docs with at least one of these tags will be returned from the Cumin AI knowledge base
retriever = client.as_retriever(search_kwargs={"k": num_docs_to_retrieve, "cuminai_tags": has_any_of_these_tags})
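The retriever can then be plugged into a regular LangChain chain. The sketch below shows one way to wire it up, assuming a local Ollama chat model and the nomic-embed-text embeddings used in the CUMINFILE samples further down; the model names and the question are placeholders, not part of the Cumin AI SDK:
# Minimal retrieval chain sketch (assumes langchain-community and a running Ollama instance)
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from cuminai import CuminAI

embedding = OllamaEmbeddings(model="nomic-embed-text")  # any LangChain Embeddings class works
client = CuminAI(source="<Cumin AI Context Source>", embedding_function=embedding)
retriever = client.as_retriever(search_kwargs={"k": 4})

def format_docs(docs):
    # Join the retrieved document contents into a single context string
    return "\n\n".join(doc.page_content for doc in docs)

prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)
llm = ChatOllama(model="llama3")  # placeholder model name

chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
print(chain.invoke("<your question about the knowledge source>"))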
For Creators
Publishing knowledge is simple with Cumin AI. Currently we support the following knowledge types:
- Links - scrapable URLs can be given as input
- Text files - .txt and .md files can be given as input. The text files should be in the same directory where CUMINFILE.yaml exists.
To upload knowledge to Cumin AI, the creators must first create a CUMINFILE.yaml in their project directory.
Sample CUMINFILE.yaml for getting started:
name: "<name of knowledge source>"
kind: LINK
version:
  tag: <tag name>
  latest: true
type: PUBLIC
embedding: ollama/nomic-embed-text:v1.5
tag:
  type: global
chunkstrategy:
  size: 1024
  overlap: 100
knowledge:
  - source: "<enter url for first link source>"
  - source: "<enter url for second link source>"
  - source: "<enter url for third link source>"
For text-based knowledge, a sample CUMINFILE is given below:
name: "<name of knowledge source>"
kind: TEXT
version:
  tag: v1
  latest: true
type: PRIVATE
embedding: ollama/nomic-embed-text:v1.5
tag:
  type: local
  minoccurances: 1
chunkstrategy:
  size: 1024
  overlap: 100
knowledge:
  - source: "<enter name with extension for first text file>"
    metadata:
      tags:
        - <document tag 1>
        - <document tag 2>
  - source: "<enter name with extension for second text file>"
  - source: "<enter name with extension for third text file>"
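In both samples, the chunkstrategy block controls how each source is split before embedding. The sketch below only illustrates what a character-based size/overlap split means; it is a hypothetical helper, not Cumin AI's actual chunker:
# Illustrative size/overlap chunking (hypothetical helper, not part of the cuminai SDK)
def chunk_text(text, size=1024, overlap=100):
    # Each chunk is at most `size` characters and shares `overlap` characters with the previous chunk
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
        if start + size >= len(text):
            break
    return chunks

sample = "x" * 2500
print([len(c) for c in chunk_text(sample)])  # [1024, 1024, 652]; consecutive chunks overlap by 100 characters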
Then make sure you have the latest version of cuminai:
pip install cuminai
Next, log in to Cumin AI using your username and the API key obtained from the Cumin AI dashboard.
cuminai login --username <username> --apikey <Cumin AI API Key>
Once you have authenticated, go to the project directory and validate your CUMINFILE.yaml by running the following command from your terminal:
cuminai validate
Once the validation is successful, you can deploy your knowledge to Cumin AI with the command below:
cuminai deploy
After deployment, your knowledge will be available to Cumin AI users at
@<username>/<name of knowledge source>
This knowledge source can be accessed in Python:
# Access Cumin AI Client
from cuminai import CuminAI
embedding = ... # use a LangChain Embeddings class
client = CuminAI(
    source="@<username>/<name of knowledge source>:<version of knowledge>",
    embedding_function=embedding
)
If <version of knowledge> is left empty, then the latest version of the knowledge is used.
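For example (placeholders throughout; this assumes the ":<version of knowledge>" suffix can simply be dropped to track the latest version):
# Pin a specific published version of the knowledge
client_v1 = CuminAI(source="@<username>/<name of knowledge source>:v1", embedding_function=embedding)
# Or omit the version tag to use the latest published version
client_latest = CuminAI(source="@<username>/<name of knowledge source>", embedding_function=embedding)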
You can log out of Cumin AI by typing the command below in your terminal:
cuminai logout
Release
Currently Cumin AI is in pre-release mode. We have exciting things planned. You can check out our roadmap to learn more.
License
File details
Details for the file cuminai-0.0.3.tar.gz.
File metadata
- Download URL: cuminai-0.0.3.tar.gz
- Upload date:
- Size: 14.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.12.3 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0a0c69fbf00be26e033ec856129fe4c00fd6247c11fba2c02328e73d9a51159f
MD5 | e28d94ee54c8b5fd45f3ae75fea05f46
BLAKE2b-256 | 3a323c43b859edc334d3b5f410f9af8b38099cf061544c029ba9b02f3688aaab
File details
Details for the file cuminai-0.0.3-py3-none-any.whl.
File metadata
- Download URL: cuminai-0.0.3-py3-none-any.whl
- Upload date:
- Size: 17.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.12.3 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7c206dd1b0f512ca9a5131dc799635e3fd1cc06d4957788542fb19fd72368c04
MD5 | 5b845966f30071b7ab9d9a0c51b0ed02
BLAKE2b-256 | 3b82534e36e7db5727a1b4adf2c8de985bac93fd232a21f9bc9da0dede5e61c3