LangChain plugin for Icosa Computing's Combinatorial Reasoning generative AI pipeline.
Icosa Computing LangChain Package
The Icosa Computing LangChain Library provides convenient access to our flagship Combinatorial Reasoning LLM pipeline, with out-of-the-box integration into the LangChain LLM application framework.
See our live demo website here!
See our arXiv preprint here!
Installation
pip install langchain_icosa
Because all processing occurs off-premises on Icosa Computing servers, this library only requires the LangChain core and OpenAI libraries.
You must also have a valid OpenAI API key; each request causes the pipeline to call the OpenAI API 212 times with this key. Supply the key, found on the OpenAI API key page, either through the `OPENAI_API_KEY` environment variable or directly when initializing the LLM.
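For example, either of the following works; the key value shown is a placeholder, and the no-argument initialization assumes the key is picked up implicitly from the environment, as noted in the initialization example below.
import os
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
# Option 1: expose the key through the environment variable.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key
llm = CombinatorialReasoningLLM()
# Option 2: pass the key explicitly at initialization.
llm = CombinatorialReasoningLLM(openai_api_key="sk-...")  # placeholder, not a real key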
Usage
The Combinatorial Reasoning LLM provides support for all functions defined by the LangChain Runnable interface. However, streaming is limited to the final output, which is emitted only after the optimizer has selected the final reasons.
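For instance, assuming the standard LangChain Runnable semantics, batched and asynchronous calls work alongside invoke and stream; the sketch below uses a placeholder API_KEY variable, and initialization details are covered in the next section.
import asyncio
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
# batch: run several prompts through the pipeline in one call.
answers = llm.batch([
    "Should I buy AMZN stock today?",
    "Should I buy AAPL stock today?",
])
# ainvoke: the asynchronous counterpart of invoke.
answer = asyncio.run(llm.ainvoke("Should I buy MSFT stock today?"))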
Initialization
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
llm = CombinatorialReasoningLLM(
linear_sensitivity = 1.0,
thresh_param = 1.0,
risk_param = 1.0,
weight = 2,
openai_api_key = API_KEY, # can also be passed implicitly via the `OPENAI_API_KEY` environment variable
model = 'gpt-4o' # defaults to gpt-4o-mini
)
The default hyper-parameters have already been tuned by the Icosa team and should suffice for almost all use cases.
Basic Calls
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
llm.invoke("There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict.", responseType='answerWithReasoning') # includes reasoning for solution
llm.invoke("There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict.") # excludes reasoning for solution
The invoke method supports 6 keyword arguments (an example follows the list):
- `linear_sensitivity`: Overrides the linear sensitivity parameter.
- `thresh_param`: Overrides the thresh parameter.
- `risk_param`: Overrides the risk parameter.
- `weight`: Overrides the weight parameter.
- `responseType`: Whether or not to include the LLM's reasoning in the response. Must be one of `answer` (default) or `answerWithReasoning`.
- `seed`: Sets the seed for the LLM. Defaults to 0.
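As a minimal sketch, the call below overrides a few of these parameters for a single request; the override values are illustrative placeholders, not tuned recommendations.
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
# Override selected hyper-parameters and request the reasoning for this call only.
response = llm.invoke(
    "Should I buy AMZN stock today?",
    linear_sensitivity=0.5,  # illustrative override of the linear sensitivity parameter
    risk_param=1.5,  # illustrative override of the risk parameter
    seed=42,  # fix the seed for reproducibility
    responseType='answerWithReasoning',
)
print(response)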
Streaming
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY, model='gpt-4o')
for token in llm.stream("Should I buy AMZN stock today?"):
print(token, end="", flush=True)
Like the invoke method, streaming supports the following keyword arguments (an example follows below):
- `linear_sensitivity`: Overrides the linear sensitivity parameter.
- `thresh_param`: Overrides the thresh parameter.
- `risk_param`: Overrides the risk parameter.
- `weight`: Overrides the weight parameter.
- `seed`: Sets the seed for the LLM. Defaults to 0.
However, unlike invoke, streaming does not support different response types. All streams will include the final reasoning.
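As a minimal sketch, keyword overrides are passed to stream the same way they are passed to invoke; the override values below are illustrative placeholders.
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
# Stream the final answer while overriding the seed and weight for this call.
for token in llm.stream("Should I buy AMZN stock today?", seed=7, weight=3):
    print(token, end="", flush=True)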
Callbacks
The LLM supports access to the sampled reasons, the distinct sampled reasons, and the final reasons selected in the annealing step via the `CombinatorialReasoningCallbackHandler`. Example usage is shown below.
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM, CombinatorialReasoningCallbackHandler
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
callback = CombinatorialReasoningCallbackHandler()
prompt = "There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict."
print(llm.invoke(prompt, config = {'callbacks': [callback]}))
print(f"Statistics: {callback.stats}")
print(f"Raw reasons: {callback.data}")
Using a Chain
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
prompt = PromptTemplate.from_template("Should I buy {ticker} stock today?")
model = CombinatorialReasoningLLM(openai_api_key=API_KEY)
chain = prompt | model | StrOutputParser()
chain.invoke({'ticker': 'AAPL'})
File details
Details for the file langchain_icosa-1.1.2.tar.gz
.
File metadata
- Download URL: langchain_icosa-1.1.2.tar.gz
- Upload date:
- Size: 8.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.5
File hashes
Algorithm | Hash digest | |
---|---|---|
SHA256 | 390b51b3de0d881505e8e228a96790af85213166760be05922a4e7a74993c8ed |
|
MD5 | 89d814e0f3eb6e370607364caa03c285 |
|
BLAKE2b-256 | 1a6bb0d8ab334b0349779343370dd47a95dcc11f394118b0722d1e2e7bf481ba |
File details
Details for the file langchain_icosa-1.1.2-py3-none-any.whl
.
File metadata
- Download URL: langchain_icosa-1.1.2-py3-none-any.whl
- Upload date:
- Size: 8.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.5
File hashes
Algorithm | Hash digest | |
---|---|---|
SHA256 | 1a32d6ea4fa001e28a621e6090963db68ecba152c0eecb5cef80a7a432d4cad8 |
|
MD5 | 6445da64bebbf9060f83a8f0cac16ced |
|
BLAKE2b-256 | 52f9d971bd44639142b4eb339b453f77a2236e46e420ec327ba9367bd0a4817a |