An integration package connecting Together AI and LangChain
langchain-together
This package contains the LangChain integration with Together AI.
Installation
pip install langchain-together
Chat Models
ChatTogether supports the various models available via the Together API:
import os

from langchain_together import ChatTogether

os.environ["TOGETHER_API_KEY"] = "my-key"

llm = ChatTogether(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",  # if not set in the environment variable
)
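Once configured, the model can be invoked like any other LangChain chat model. A minimal usage sketch (the (role, content) message tuples below are illustrative):

messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
response = llm.invoke(messages)
print(response.content)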
Structured Outputs, Function Calls, JSON Mode
ChatTogether supports structured outputs using Pydantic models, dictionaries, or JSON schemas, which lets you get reliable, structured responses from Together AI models. See the Together AI documentation for more information about function calling and structured outputs.
from typing import Optional

from langchain_together import ChatTogether
from pydantic import BaseModel, Field


class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")
    rating: Optional[int] = Field(default=None, description="How funny the joke is, from 1 to 10")


# Use a model that supports structured outputs
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")

structured_llm = llm.with_structured_output(Joke.model_json_schema(), method="json_schema")
result = structured_llm.invoke("Tell me a joke about programming")

# Passing a JSON schema dict (rather than the Pydantic class) returns a dict
print(f"Setup: {result['setup']}")
print(f"Punchline: {result['punchline']}")
print(f"Rating: {result['rating']}")
Function Calling
# Use a model that supports function calling
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")

structured_llm = llm.with_structured_output(Joke, method="function_calling")
result = structured_llm.invoke("Tell me a joke about programming")

# Passing the Pydantic class returns a Joke instance, so attribute access works
print(f"Setup: {result.setup}")
print(f"Punchline: {result.punchline}")
print(f"Rating: {result.rating}")
JSON Mode
For models that support JSON mode, you can also use this method:
from langchain_together import ChatTogether
from pydantic import BaseModel, Field


class Response(BaseModel):
    message: str = Field(description="The main message")
    category: str = Field(description="Category of the response")


# Use a model that supports JSON mode
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")

structured_llm = llm.with_structured_output(Response.model_json_schema(), method="json_mode")
result = structured_llm.invoke(
    "Respond with a JSON containing a message about cats and categorize it. "
    "Use the exact keys 'message' and 'category'."
)
print(result)  # e.g. {"message": "...", "category": "..."}
Embeddings
from langchain_together import TogetherEmbeddings
embeddings = TogetherEmbeddings(model="BAAI/bge-base-en-v1.5")
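A minimal usage sketch via the standard LangChain embeddings interface (the example strings are illustrative):

# Embed a single query string
query_vector = embeddings.embed_query("What is the capital of France?")

# Embed multiple documents at once
doc_vectors = embeddings.embed_documents([
    "Paris is the capital of France.",
    "Together AI provides fast inference for open models.",
])

print(len(query_vector))  # dimensionality of the embedding vector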
Download files
File details
Details for the file langchain_together-0.3.1.tar.gz.
File metadata
- Download URL: langchain_together-0.3.1.tar.gz
- Upload date:
- Size: 10.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm   | Hash digest |
|---|---|
| SHA256      | 123a3d6da11c3a32c6e7653f9b52a7793789ddba5a871a2392db0ee3d389be78 |
| MD5         | 6780b8cd1dbc5b9407830f71f7fb31b8 |
| BLAKE2b-256 | 1a67b2f8a2ab856fe962905c3cdb2c96d531f6cf7ae74b55c6c0380b52f6c99d |
File details
Details for the file langchain_together-0.3.1-py3-none-any.whl.
File metadata
- Download URL: langchain_together-0.3.1-py3-none-any.whl
- Upload date:
- Size: 12.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm   | Hash digest |
|---|---|
| SHA256      | f6638c3a1db2732a3e62ed0fdfbac0653765f0e561e7082320952731846945b2 |
| MD5         | 6bc4853298b49aeee0e969c95a04336c |
| BLAKE2b-256 | 6a9469a4cae7c37a5619c610cc443ca8cdad5495438aed3f72fa936ed38cadf3 |