Python client for interacting with AnswerRocket's skill API

Project description

AnswerRocket Skill API Client

This is a client library for interacting with an AnswerRocket instance.

Installation

pip install answerrocket-client

Use

from answer_rocket import AnswerRocketClient
arc = AnswerRocketClient(url='https://your-answerrocket-instance.com', token='<your_api_token>')

# test that the config is valid
arc.can_connect()

# Get a resource file.  When running in an AnswerRocket instance, this call will fetch a customized version of the resource if one has been created.
import json
some_resource = json.loads(arc.config.get_artifact('path/to/my/file.json'))

# to run SQL, get the database ID from an AnswerRocket environment
table_name = "my_table"
sql = "SELECT sum(my_measure) from "+table_name
database_id = "my_database_id"

execute_sql_query_result = arc.data.execute_sql_query(database_id, sql, 100)

if execute_sql_query_result.success:
    print(execute_sql_query_result.df)    
else:
    print(execute_sql_query_result.error)
    print(execute_sql_query_result.code)

# language model calls use the configured settings from the connected Max instance (except for the secret key)
success, model_reply = arc.chat.completion(messages = "hakuna")

if success:
    # the reply is the full value of the LLM's return object
    reply = model_reply["choices"][0]["message"]["content"]
    print(f"** {reply} **")
else:
    # error reply is a description of the exception
    print("Error: "+model_reply)

# chat conversations and streaming replies are supported
messages = [
    { "role":"system",
      "content":"You are an efficient assistant helping a business user answer questions about data."},
    { "role":"user",
      "content":"Can you tell me the average of 150,12,200,54,24 and 32?  are any of these outliers?  Explain why."}
]

def display_streaming_result(chunk):
    # print each streamed chunk of the reply as soon as it arrives, without a trailing newline
    print(chunk, end="", flush=True)

success, reply = arc.chat.completion(messages = messages, stream_callback=display_streaming_result)

Notes:

  • Both the API token and the instance URL can instead be provided via the AR_TOKEN and AR_URL environment variables, respectively. This is recommended to avoid accidentally committing a dev API token in your skill code. API tokens are available through the AnswerRocket UI for authenticated users.
  • When running outside of an AnswerRocket installation, such as during development, make sure the OpenAI key is set before importing answer_rocket, e.g. os.environ['OPENAI_API_KEY'] = openai_completion_key. Get this key from OpenAI. A sketch of this setup follows the list.
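A minimal sketch of environment-based configuration, assuming the constructor falls back to AR_URL and AR_TOKEN when no arguments are passed (as the first note describes); the OpenAI key value is a placeholder you supply from your own account:

import os

# during local development, set the OpenAI key before importing answer_rocket
os.environ['OPENAI_API_KEY'] = '<your_openai_completion_key>'

from answer_rocket import AnswerRocketClient

# with AR_URL and AR_TOKEN exported in your shell, no url/token arguments are needed here
arc = AnswerRocketClient()
print(arc.can_connect())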

Working on the SDK

Setup

This repository contains a .envrc file for use with direnv. With direnv installed, its hook will activate a dedicated Python interpreter for you when you cd into this repository.

Once you have direnv set up and activating inside the repo, just run make to install dev dependencies and get started.

Finding things in the codebase

The main point of contact for users of this SDK is AnswerRocketClient in answer_rocket/client.py; it is what users import and initialize. Different categories of utilities can be grouped into modules in whatever way is most convenient, but they should be exposed via the client rather than through a separate import so that utilities for authentication, etc., can be reused, as sketched below.
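For illustration only, a minimal sketch of that pattern using a hypothetical reporting module (the module, class, and attribute names here are invented for the example and do not exist in the codebase):

# answer_rocket/reporting.py (hypothetical module)
class ReportingClient:
    def __init__(self, auth_helper):
        # reuse the client's shared auth helper rather than re-implementing authentication
        self._auth = auth_helper

    def list_reports(self):
        # placeholder for a real call to the SDK's GraphQL API
        return []


# answer_rocket/client.py (illustrative excerpt, not the actual implementation)
class AnswerRocketClient:
    def __init__(self, url=None, token=None):
        self._auth = ...  # however the real client builds its auth helper
        # expose the new utility group through the client so users don't need a separate import
        self.reporting = ReportingClient(self._auth)


# user code then reaches the utilities via the client instance:
# arc = AnswerRocketClient(url, token)
# arc.reporting.list_reports()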

The client hits an SDK-specific GraphQL API on its target AnswerRocket server. There is a graphql/schema.py with generated Python types describing the available queries. When needed, it can be regenerated with the generate-gql-schema Makefile target; see the Makefile for details.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

answerrocket_client-0.2.23.tar.gz (26.7 kB, Source)

Built Distribution

answerrocket_client-0.2.23-py3-none-any.whl (28.8 kB, Python 3)

File details

Details for the file answerrocket_client-0.2.23.tar.gz.

File metadata

  • Download URL: answerrocket_client-0.2.23.tar.gz
  • Upload date:
  • Size: 26.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for answerrocket_client-0.2.23.tar.gz:

  • SHA256: 7dfe365ca18c891fa1fb6d60345f3aaa94a4e6328493107f0be1b47c8239da45
  • MD5: 38e0b13ed82a8d8cba03ef3d76589510
  • BLAKE2b-256: 65c464a3d75da31b06bd70c229dc8cebeb4d6a879fbe342748aebcf0f8ace047


Provenance

The following attestation bundles were made for answerrocket_client-0.2.23.tar.gz:

Publisher: publish-to-pypi.yml on answerrocket/answerrocket-python-client


File details

Details for the file answerrocket_client-0.2.23-py3-none-any.whl.

File metadata

File hashes

Hashes for answerrocket_client-0.2.23-py3-none-any.whl:

  • SHA256: ba39b9c82323575951a130239301b22f05287044c2378075106e2c27904c6602
  • MD5: 1a396cd1f88bce29c24a896de95736b2
  • BLAKE2b-256: 46dcdfac939f0b0e85e74038cf2e9e8b6d599ba9efa83a6e142ed24cf5236627


Provenance

The following attestation bundles were made for answerrocket_client-0.2.23-py3-none-any.whl:

Publisher: publish-to-pypi.yml on answerrocket/answerrocket-python-client

