# Ally AI

ally for your ai
## How to use
### Config File

Ally AI reads its LLM and embeddings settings from a YAML config file:
```yaml
llm:
  api_key: '<private-key>'
  api_version: "<api-version>"
  endpoint: "<endpoint>"
  model: "<model-name>"
  deployment_name: '<deployment-name>'
  temperature: 0.7
  streaming: true

embeddings:
  api_key: '<private-key>'
  api_version: "<api-version>"
  endpoint: "<endpoint>"
  model: "<model-name>"
  deployment_name: '<deployment-name>'
```
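Before wiring the config into the library, it can help to check that the YAML parses into the two expected sections. A minimal sketch, assuming PyYAML is installed; the inline string and placeholder values below are illustrative, not part of the package:

```python
import yaml  # PyYAML; assumed to be available

# Illustrative config mirroring the structure shown above.
CONFIG = """
llm:
  api_key: '<private-key>'
  api_version: '<api-version>'
  endpoint: '<endpoint>'
  model: '<model-name>'
  deployment_name: '<deployment-name>'
  temperature: 0.7
  streaming: true
embeddings:
  api_key: '<private-key>'
  api_version: '<api-version>'
  endpoint: '<endpoint>'
  model: '<model-name>'
  deployment_name: '<deployment-name>'
"""

config = yaml.safe_load(CONFIG)

# Both top-level sections should be present, and typed values should
# survive parsing as a float and a bool rather than strings.
assert {'llm', 'embeddings'} <= config.keys()
print(config['llm']['temperature'])  # → 0.7
print(config['llm']['streaming'])    # → True
```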
### Create an LLM
#### Use Default Settings in LLM
```python
from ally_ai.langchain import LLM

llm = LLM()
response = llm.invoke('What is an ally?')
print(response)
```
#### Use Custom Settings in LLM
Add a custom section to your config file:

```yaml
my_llm:
  api_key: '<private-key>'
  api_version: "<api-version>"
```

Then point `Settings` at that section by name:
```python
from ally_ai.langchain import LLM, Settings

settings = Settings(section='my_llm')
llm = LLM(settings=settings)

print(llm.settings.path)
print(llm.settings.section)
```
#### Override Settings in LLM
```python
from ally_ai.langchain import LLM, Settings

settings = Settings(section='my_llm', api_key='<new-api-key>')
llm = LLM(settings=settings)
response = llm.invoke('What is an ally?')
print(response)
```
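The override behavior above can be pictured as a two-layer merge: values come from the named config section first, and explicit keyword arguments win. A hypothetical sketch of that layering (the real `Settings` internals are not shown in this README, and `load_settings` is an illustrative helper, not part of the package):

```python
def load_settings(section_values: dict, **overrides) -> dict:
    """Start from the values in the config section, then let explicit
    keyword arguments take precedence."""
    return {**section_values, **overrides}

# Illustrative stand-in for the 'my_llm' section of the config file.
file_section = {'api_key': '<private-key>', 'api_version': '<api-version>'}

settings = load_settings(file_section, api_key='<new-api-key>')
print(settings['api_key'])      # → '<new-api-key>' (overridden)
print(settings['api_version'])  # → '<api-version>' (from the file)
```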
### How to Create Embeddings
```python
from ally_ai.langchain import Embeddings

embeddings = Embeddings()
response = embeddings.embed_query('What is an ally?')
print(response)
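`embed_query` typically returns an embedding vector (a list of floats). A common next step is comparing two embeddings by cosine similarity; the sketch below uses small made-up vectors so it runs without an API key:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Made-up stand-ins for vectors returned by embed_query.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
v3 = [0.3, -0.2, 0.1]

print(round(cosine_similarity(v1, v2), 3))  # identical vectors → 1.0
print(round(cosine_similarity(v1, v3), 3))  # dissimilar vectors score lower
```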
Source distribution: `ally_ai-0.1.1.tar.gz` (2.9 kB)