LangChain plugin for Icosa Computing's Combinatorial Reasoning generative AI pipeline.
Icosa Computing LangChain Package
The Icosa Computing LangChain Library provides convenient access to our flagship Combinatorial Reasoning LLM pipeline, with out-of-the-box integration with the LangChain LLM application framework.
See our live demo website here!
See our arXiv preprint here!
Installation
pip install langchain_icosa
Because all processing occurs off-premises on Icosa Computing servers, this library only requires the LangChain core and OpenAI libraries.
You must also have a valid OpenAI API key; the pipeline calls the OpenAI API 212 times per request. You can supply the key, found on the OpenAI API key page, either through the OPENAI_API_KEY environment variable or directly when constructing the LLM.
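For example, the key can be made available implicitly before the LLM is constructed. A minimal sketch; the key value is a placeholder:
import os

# Store the key in the environment so it can be picked up implicitly instead of being passed explicitly.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key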
Usage
The Combinatorial Reasoning LLM provides support for all functions defined by the LangChain Runnable interface. However, streaming is limited to the final output after the final reasons have been selected by the optimizer.
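Since the class implements the Runnable interface, the asynchronous entry points defined by that interface (such as ainvoke) should also be available. A minimal sketch, assuming LangChain's default async fallbacks for synchronous LLMs:
import asyncio
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM

async def main():
    llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
    # ainvoke is defined by the Runnable interface; by default it delegates to the synchronous invoke
    print(await llm.ainvoke("Should I buy AMZN stock today?"))

asyncio.run(main())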
Initialization
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
llm = CombinatorialReasoningLLM(
    linear_sensitivity = 1.0,
    thresh_param = 1.0,
    risk_param = 1.0,
    weight = 2,
    openai_api_key = API_KEY, # can also be passed implicitly via the OPENAI_API_KEY environment variable
    model = 'gpt-4o' # defaults to gpt-4o-mini
)
The default hyper-parameters have already been tuned by the Icosa team, so they should suffice for almost all use cases.
Basic Calls
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
llm.invoke("There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict.", responseType='answerWithReasoning') # includes reasoning for solution
llm.invoke("There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict.") # excludes reasoning for solution
The invoke method supports 6 keyword arguments (a per-call example follows the list):
linear_sensitivity: Overrides the linear sensitivity parameter.
thresh_param: Overrides the thresh parameter.
risk_param: Overrides the risk parameter.
weight: Overrides the weight parameter.
responseType: Whether or not to include the LLM's reasoning in the response. Must be one of answer (default) or answerWithReasoning.
seed: Sets the seed for the LLM. Defaults to 0.
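For example, a single invoke call can override the optimizer parameters and request the reasoning; the parameter values below are purely illustrative:
llm.invoke(
    "There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict.",
    responseType='answerWithReasoning',
    risk_param=0.5,  # illustrative override of the default risk parameter
    seed=42,         # illustrative seed for reproducibility
)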
Streaming
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY, model='gpt-4o')
for token in llm.stream("Should I buy AMZN stock today?"):
    print(token, end="", flush=True)
Like the invoke method, streaming supports the following keyword arguments (an example follows the list):
linear_sensitivity: Overrides the linear sensitivity parameter.
thresh_param: Overrides the thresh parameter.
risk_param: Overrides the risk parameter.
weight: Overrides the weight parameter.
seed: Sets the seed for the LLM. Defaults to 0.
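For instance, a stream call with per-call overrides might look like the following; the values are illustrative:
for token in llm.stream("Should I buy AMZN stock today?", seed=42, risk_param=0.5):
    print(token, end="", flush=True)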
However, unlike invoke, streaming does not support different response types. All streams will include the final reasoning.
Callbacks
The LLM supports access to the sampled reasons, the distinct sampled reasons, and the final reasons selected in the annealing step via the CombinatorialReasoningCallbackHandler. Example usage is shown below.
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM, CombinatorialReasoningCallbackHandler
llm = CombinatorialReasoningLLM(openai_api_key=API_KEY)
callback = CombinatorialReasoningCallbackHandler()
prompt = "There are six animals: lion, hyena, elephant, deer, cat and mouse. Separate them to three spaces to minimize conflict."
print(llm.invoke(prompt, config = {'callbacks': [callback]}))
print(f"Statistics: {callback.stats}")
print(f"Raw reasons: {callback.data}")
Using a Chain
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_icosa.combinatorial_reasoning import CombinatorialReasoningLLM
prompt = PromptTemplate.from_template("Should I buy {ticker} stock today?")
model = CombinatorialReasoningLLM(openai_api_key=API_KEY)
chain = prompt | model | StrOutputParser()
chain.invoke({'ticker': 'AAPL'})
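Because CombinatorialReasoningLLM implements the Runnable interface, the chain built above also exposes LangChain's standard batch method. A small usage sketch; the tickers are illustrative:
# Run the same chain over several tickers in a single call
chain.batch([{'ticker': 'AAPL'}, {'ticker': 'MSFT'}, {'ticker': 'NVDA'}])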