# Ally AI

A LangChain Chroma package for Ally AI.

## How to use

### Config File
```yaml
llm:
  api_key: '<private-key>'
  api_version: '<api-version>'
  endpoint: '<endpoint>'
  model: '<model-name>'
  deployment_name: '<deployment-name>'
  temperature: 0.7
  streaming: true

embeddings:
  api_key: '<private-key>'
  api_version: '<api-version>'
  endpoint: '<endpoint>'
  model: '<model-name>'
  deployment_name: '<deployment-name>'
```
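Loaded into Python, the config above becomes nested dictionaries keyed by section name. A minimal sketch of the section lookup and keyword override that `Settings` appears to perform; the `config` dict and the `load_section` helper here are hypothetical illustrations, not part of the package:

```python
# Hypothetical illustration: the YAML config above, as the nested dict
# a YAML loader would produce.
config = {
    "llm": {
        "api_key": "<private-key>",
        "api_version": "<api-version>",
        "endpoint": "<endpoint>",
        "model": "<model-name>",
        "deployment_name": "<deployment-name>",
        "temperature": 0.7,
        "streaming": True,
    },
    "embeddings": {
        "api_key": "<private-key>",
        "api_version": "<api-version>",
        "endpoint": "<endpoint>",
        "model": "<model-name>",
        "deployment_name": "<deployment-name>",
    },
}

def load_section(config: dict, section: str, **overrides) -> dict:
    """Return one section of the config with keyword overrides applied."""
    settings = dict(config[section])  # copy so the base config is untouched
    settings.update(overrides)
    return settings

llm_settings = load_section(config, "llm", api_key="<new-api-key>")
print(llm_settings["api_key"])      # the keyword override wins
print(llm_settings["temperature"])  # other keys come from the config
```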
## Create LLM

### Use Default Settings in LLM

```python
from ally_ai.llamaindex import LLM

llm = LLM()
response = llm.invoke('What is an ally?')
print(response)
```
### Use Custom Settings in LLM

Add a new section to the config file:

```yaml
my_llm:
  api_key: '<private-key>'
  api_version: '<api-version>'
```

Then point `Settings` at that section:

```python
from ally_ai.llamaindex import LLM, Settings

settings = Settings(section='my_llm')
llm = LLM(settings=settings)
print(llm.settings.path)
print(llm.settings.section)
```
### Override Settings in LLM

Keyword arguments passed to `Settings` take precedence over the values in the config file:

```python
from ally_ai.llamaindex import LLM, Settings

settings = Settings(section='my_llm', api_key='<new-api-key>')
llm = LLM(settings=settings)
response = llm.invoke('What is an ally?')
print(response)
```
## How to Create Embeddings

```python
from ally_ai.llamaindex import Embeddings

embeddings = Embeddings()
response = embeddings.embed_query('What is an ally?')
print(response)
```
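`embed_query` returns a vector of floats. A common next step in Chroma-style retrieval is ranking stored vectors by cosine similarity against a query vector; a stdlib-only sketch (the toy vectors below stand in for real `embed_query` output):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embed_query output.
query_vec = [0.1, 0.3, 0.5]
doc_vecs = {
    "doc-a": [0.1, 0.3, 0.5],   # same direction as the query
    "doc-b": [0.5, 0.1, -0.2],
}
best = max(doc_vecs, key=lambda d: cosine_similarity(query_vec, doc_vecs[d]))
print(best)  # doc-a
```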
## Download files

### Source Distribution

`ally_ai_chroma-0.0.2.tar.gz` (3.5 kB)

### Built Distribution

`ally_ai_chroma-0.0.2-py3-none-any.whl`
### Hashes for ally_ai_chroma-0.0.2-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 6a7a096c46223c4ad97bda31d7005cbf1e1d46dd7aa993f77f84d92eb43ce171 |
| MD5 | 2d371ac2a2477dd6a5c75a3e530b75c2 |
| BLAKE2b-256 | daa50f3314d20c7d7fcae1bd73a410cbfaa17e9a894ee84f0574995da562fe1c |