
anaconda-assistant

The Anaconda Assistant Python client

Installation

conda install -c anaconda-cloud anaconda-assistant-sdk

How to authenticate

To use the Python client or CLI, you can authenticate with the anaconda login CLI, Anaconda Navigator, or

from anaconda_auth import login
login()

which launches a browser to log in and saves your API token to disk. For cases where you cannot use a browser to log in, you can grab your API key and set the ANACONDA_AUTH_API_KEY=<api-key> environment variable.

The Python clients and integrations also accept api_key as a keyword argument.
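The precedence between the keyword argument and the environment variable can be sketched as follows; resolve_api_key is a hypothetical helper for illustration, not part of the SDK:

```python
import os

def resolve_api_key(api_key=None):
    """Hypothetical helper illustrating credential precedence:
    an explicit api_key keyword argument wins; otherwise fall back
    to the ANACONDA_AUTH_API_KEY environment variable."""
    if api_key is not None:
        return api_key
    return os.environ.get("ANACONDA_AUTH_API_KEY")
```

Under this assumption, passing ChatClient(api_key=...) would take priority over a value set in the environment.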

Terms of use and data collection

To use the Anaconda Assistant SDK and derived integrations, you must first agree to Anaconda's terms of service.

On data collection: if you opt in, you will enjoy personalized recommendations and contribute to smarter features.

We prioritize your privacy:

  • Your data is never sold
  • Always secured
  • This setting only affects the data Anaconda stores
  • It does not affect the data that is sent to OpenAI

To agree to the terms of service and configure data collection, edit the ~/.anaconda/config.toml file:

[plugin.assistant]
accepted_terms = true
data_collection = true

You may set data_collection = false if you choose to opt out.

If you set accepted_terms = false the Anaconda Assistant SDK and derived integrations will not function.

If either or both of these values are unset in the ~/.anaconda/config.toml file, an exception will be raised.

Chat session

The ChatSession provides a multi-turn chat interface that saves input and output messages. The response can be streamed or provided in one chunk.

from anaconda_assistant import ChatSession

chat = ChatSession()

text = chat.completions("what are the first 5 fibonacci numbers?", stream=False)
print(text)

text = chat.completions("make that the first 10 numbers", stream=True)
for chunk in text:
    print(chunk, end="")

Chat client

The ChatClient provides a low-level completions function that accepts a list of messages in the same format as OpenAI. The completions() method returns a ChatResponse object that supports streaming of the response, similar to a requests Response, with .iter_content(), .iter_lines(), and .message, which returns the whole message as a string. Once the whole message has been accessed or consumed through an iterator, it is retained in the .message attribute. Additionally, the response object stores the .tokens_used and .tokens_limit integers.

from anaconda_assistant import ChatClient

client = ChatClient()

messages = [
    {"role": "user", "content": "What is pi?"}
]

response = client.completions(messages=messages)

for chunk in response.iter_content():
    print(chunk, end="")

You can only consume the message with .iter_content() once, but the result is captured to the .message attribute while streaming.
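That capture-while-streaming behavior can be sketched with a minimal stand-in class (not the SDK's implementation, just an illustration of the contract described above):

```python
class StreamingResponse:
    """Minimal stand-in: chunks are captured as they stream, and
    .message always yields the full text, draining any remainder."""

    def __init__(self, chunks):
        self._chunks = iter(chunks)
        self._captured = []

    def iter_content(self):
        for chunk in self._chunks:
            self._captured.append(chunk)
            yield chunk

    @property
    def message(self):
        # Drain anything not yet consumed, then join the captured chunks
        for _ in self.iter_content():
            pass
        return "".join(self._captured)

resp = StreamingResponse(["Pi is ", "approximately ", "3.14159"])
streamed = "".join(resp.iter_content())
print(resp.message == streamed)  # same text whether streamed or read whole
```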

Daily quotas

Each Anaconda subscription plan enforces a limit on the number of requests (calls to .completions()). The limits are documented on the Plans and Pricing page. Once the limit is reached, .completions() raises a DailyQuotaExceeded exception.

Users can upgrade their plans by visiting https://anaconda.cloud/profile/subscriptions.
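A typical pattern is to catch DailyQuotaExceeded and surface the upgrade path to the user. The sketch below uses a stand-in client and a locally defined exception so it runs offline; in real code you would import DailyQuotaExceeded from the SDK (the exact import path is not shown in this document):

```python
class DailyQuotaExceeded(Exception):
    """Local stand-in for the SDK's exception."""

class FakeClient:
    """Stand-in client that raises once its daily limit is exhausted."""

    def __init__(self, daily_limit):
        self.daily_limit = daily_limit
        self.calls = 0

    def completions(self, prompt):
        self.calls += 1
        if self.calls > self.daily_limit:
            raise DailyQuotaExceeded("daily request limit reached")
        return f"response to {prompt!r}"

client = FakeClient(daily_limit=2)
for i in range(3):
    try:
        print(client.completions(f"question {i}"))
    except DailyQuotaExceeded:
        print("Quota exhausted; upgrade at https://anaconda.cloud/profile/subscriptions")
```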

Integrations

A number of third-party integrations are provided. Each requires optional packages to be installed.

LLM CLI

The LLM CLI can be used to send and receive messages with the Anaconda Assistant.

Required packages: llm

To direct your messages to the Anaconda Assistant, use the model name anaconda-assistant:

> llm -m anaconda-assistant 'what is pi?'

LlamaIndex

To use the LlamaIndex integration you will need to install at least llama-index-core

Required packages: llama-index-core

The AnacondaAssistant class supports streaming and non-streaming completions and chat methods. A system prompt can be provided to AnacondaAssistant with the system_prompt keyword argument.

from anaconda_assistant.integrations.llama_index import AnacondaAssistant

llm = AnacondaAssistant()

# Completions example
for c in llm.stream_complete('who are you?'):
    print(c.delta, end='')

# Chat example
from llama_index.core.llms import ChatMessage
response = llm.chat(messages=[ChatMessage(content="Who are you?")])
print(response)

# Custom system prompt
prompted = AnacondaAssistant(system_prompt='you are a kitty; you will respond with meow!')
print(prompted.complete('what is pi?'))

LangChain

A LangChain integration is provided that supports message streaming and non-streaming responses.

Required packages: langchain-core >=0.3 and langchain >=0.3

from anaconda_assistant.integrations.langchain import AnacondaAssistant
from langchain.prompts import ChatPromptTemplate

model = AnacondaAssistant()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model

message = chain.invoke({'topic': 'python'})
print(message.content)

ELL

You can use Anaconda Assistant as a model in the ell prompt engineering framework.

Required packages: ell-ai[sqlite] or ell-ai[postgres]

import ell
import anaconda_assistant.integrations.ell

ell.init(verbose=True)

@ell.simple(model="anaconda-assistant")
def who():
    return "Who are you?"

who()

PandasAI

To use the Anaconda Assistant with PandasAI, configure the SmartDataframe using the AnacondaAssistant class:

Required packages: pandasai

import pandas as pd

from anaconda_assistant.integrations.pandasai import AnacondaAssistant
from pandasai import SmartDataframe

df = pd.DataFrame({"sales": [100, 200, 300], "region": ["east", "west", "east"]})

ai = AnacondaAssistant()
sdf = SmartDataframe(df, config={'llm': ai})
sdf.chat('what is the average of this column where some condition is true?')

Panel

You can integrate the Anaconda Assistant into your Panel application using its chat features, including the name and avatar of the signed-in user.

Required packages: panel

import panel as pn

from anaconda_auth import BaseClient
from anaconda_assistant.integrations.panel import AnacondaAssistantCallbackHandler

callback = AnacondaAssistantCallbackHandler()
auth_client = BaseClient()

chat = pn.chat.ChatInterface(
    callback=callback,
    user=auth_client.name,
    avatar=auth_client.avatar,
    placeholder_threshold=0.05
)

Setup for development

Ensure you have conda installed. Then run:

make setup

Run the unit tests

make test

Run the unit tests across isolated environments with tox

make tox
