
cuminai

This package contains the Cumin AI Python SDK. Cumin AI is a managed LLM context service. This package provides integration with LangChain.

Installation

pip install cuminai

In the rare case that you are on Windows and get a "Filename too long" error for a dependency package while installing cuminai, run the command below to fix it:

git config --global core.longpaths true

Usage

The CuminAI class provides easy access to the Cumin AI context store.

# Setup API key
import os
from getpass import getpass

CUMINAI_API_KEY = getpass("Enter Cumin AI API Key: ")
os.environ["CUMINAI_API_KEY"] = CUMINAI_API_KEY
# Access Cumin AI Client
from cuminai import CuminAI

embedding = ...  # use a LangChain Embeddings class

client = CuminAI(
    source="<Cumin AI Context Source>",
    embedding_function=embedding,
)
# Get a LangChain retriever for appending to a chain.
num_docs_to_retrieve = ...  # number of docs to retrieve. Defaults to 4
retriever = client.as_retriever(search_kwargs={"k": num_docs_to_retrieve})
# Get a LangChain retriever returning only documents that have at least one of the given tags.
num_docs_to_retrieve = ...  # number of docs to retrieve. Defaults to 4
has_any_of_these_tags = ["<document tag 1>", "<document tag 2>"]  # only docs with at least one of these tags are returned from the Cumin AI knowledge base
retriever = client.as_retriever(search_kwargs={"k": num_docs_to_retrieve, "cuminai_tags": has_any_of_these_tags})
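The tag filter uses OR semantics: a document qualifies if it carries at least one of the requested tags. As a rough local illustration of that behavior (an assumption about the service's matching, not the SDK's actual implementation; the document list and tag names below are made up):

```python
# Sketch of the assumed OR semantics behind cuminai_tags filtering.
def matches_any_tag(doc_tags, requested_tags):
    """Return True if the document carries at least one requested tag."""
    return bool(set(doc_tags) & set(requested_tags))

docs = [
    {"id": 1, "tags": ["faq", "billing"]},
    {"id": 2, "tags": ["setup"]},
]
hits = [d["id"] for d in docs if matches_any_tag(d["tags"], ["billing", "legal"])]
print(hits)  # [1]
```

With the real retriever, this matching happens server-side in the Cumin AI knowledge base.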

For Creators

Publishing knowledge is simple with Cumin AI. Currently we support the following knowledge types:

  • Links - scrapable URLs can be given as input
  • Text files - .txt and .md files can be given as input. The text files must live in the same directory as CUMINFILE.yaml.

To upload knowledge to Cumin AI, the creators must first create a CUMINFILE.yaml in their project directory.

Sample CUMINFILE.yaml for getting started:

name: "<name of knowledge source>"
kind: LINK
version: 1
type: PUBLIC
embedding: ollama/nomic-embed-text:v1.5
tag:
    type: global
chunkstrategy:
    size: 1024
    overlap: 100
knowledge:
    - source: "<enter url for first link source>"
    - source: "<enter url for second link source>"
    - source: "<enter url for third link source>"
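The chunkstrategy fields can be read as a sliding window: each chunk is size characters long and consecutive chunks share overlap characters. A minimal sketch under the assumption of character-based chunking (the service may well chunk differently, e.g. by tokens):

```python
def chunk(text, size=1024, overlap=100):
    """Split text into windows of `size` chars, each overlapping the previous by `overlap`."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

pieces = chunk("a" * 2048, size=1024, overlap=100)
print(len(pieces), len(pieces[0]))  # 3 1024
```

A larger overlap reduces the chance that a sentence is cut in half at a chunk boundary, at the cost of storing some text twice.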

A sample CUMINFILE for text-based knowledge is given below:

name: "<name of knowledge source>"
kind: TEXT
version: 1
type: PRIVATE
embedding: ollama/nomic-embed-text:v1.5
tag:
    type: local
    minoccurances: 1
chunkstrategy:
    size: 1024
    overlap: 100
knowledge:
    - source: "<enter name with extension for first text file>"
      metadata:
          tags:
              - <document tag 1>
              - <document tag 2>
    - source: "<enter name with extension for second text file>"
    - source: "<enter name with extension for third text file>"
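Note that metadata (and its tags list) is optional per knowledge entry. Purely as an illustration of that shape, a hypothetical loader might normalize entries like so (the file names here are made up):

```python
entries = [
    {"source": "intro.txt", "metadata": {"tags": ["tag1", "tag2"]}},
    {"source": "faq.txt"},  # no metadata declared
]

def normalize(entry):
    """Attach an empty tag list when an entry declares no metadata."""
    tags = entry.get("metadata", {}).get("tags", [])
    return {"source": entry["source"], "tags": tags}

normalized = [normalize(e) for e in entries]
print(normalized)
```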

Then make sure you have the latest version of cuminai:

pip install cuminai

Next, log in to Cumin AI using your username and the API key obtained from the Cumin AI dashboard:

cuminai login --username <username> --apikey <Cumin AI API Key>

Once authenticated, go to the project directory and validate your CUMINFILE.yaml by running the following command from your terminal:

cuminai validate

Once validation succeeds, deploy your knowledge to Cumin AI with the command below:

cuminai deploy

After deployment, your knowledge will be available to Cumin AI users at:

@<username>/<name of knowledge source>
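Identifiers follow the @<username>/<name of knowledge source> pattern. A small sketch of validating that shape (the exact allowed characters are an assumption, and the example handle is made up):

```python
import re

# Assumed pattern: "@" + username + "/" + knowledge source name.
IDENTIFIER = re.compile(r"^@(?P<username>[\w.-]+)/(?P<name>[\w.-]+)$")

m = IDENTIFIER.match("@alice/product-docs")
print(m.group("username"), m.group("name"))  # alice product-docs
```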

This knowledge source can then be accessed in Python:

# Access Cumin AI Client
from cuminai import CuminAI

embedding = ...  # use a LangChain Embeddings class

client = CuminAI(
    source="@<username>/<name of knowledge source>",
    embedding_function=embedding,
)

You can log out of Cumin AI by typing the command below in your terminal:

cuminai logout

Release

Cumin AI is currently in pre-release. We have exciting things planned; check out our roadmap to learn more.

License

Apache 2.0
