
An integration package connecting Together AI and LangChain


langchain-together

This package contains the LangChain integration with Together AI.

Installation

pip install langchain-together

Chat Models

ChatTogether supports the various models available via the Together API:

import os

from langchain_together import ChatTogether

os.environ["TOGETHER_API_KEY"] = "my-key"


llm = ChatTogether(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",  # if not set in environment variable
)
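Under the hood, ChatTogether speaks Together's OpenAI-compatible chat completions API. As a rough, stdlib-only sketch (the payload shape shown is an illustration of that convention, not the client's internal representation), the configuration above corresponds to a request body like:

```python
# Illustrative sketch of the OpenAI-compatible request body that a call such as
# llm.invoke("Say hello.") ultimately produces. Field names follow the chat
# completions convention; the exact wire format is an assumption here.
payload = {
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    "temperature": 0,
    "max_tokens": None,  # None defers to the service's default limit
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
}
```

In normal use you never build this dict yourself; `llm.invoke(...)` with a string or a list of messages handles it.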

Structured Outputs, Function Calls, JSON Mode

ChatTogether supports structured outputs using Pydantic models, dictionaries, or JSON schemas. This feature lets you get reliable, structured responses from Together AI models. See the docs for more information about function calling and structured outputs.

from langchain_together import ChatTogether
from pydantic import BaseModel, Field
from typing import Optional

class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")
    rating: Optional[int] = Field(default=None, description="How funny the joke is from 1-10")

# Use a model that supports structured outputs
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")
structured_llm = llm.with_structured_output(Joke.model_json_schema(), method="json_schema")

result = structured_llm.invoke("Tell me a joke about programming")
# A JSON-schema dict yields a plain dict; pass `Joke` itself to get a Joke instance
print(f"Setup: {result['setup']}")
print(f"Punchline: {result['punchline']}")
print(f"Rating: {result['rating']}")
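The example above passes the schema as a dict produced by `Joke.model_json_schema()`. You can inspect that schema locally without any API call; a quick look (assuming Pydantic v2, which the package's `model_json_schema` call already implies):

```python
from typing import Optional

from pydantic import BaseModel, Field

class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")
    rating: Optional[int] = Field(default=None, description="How funny the joke is from 1-10")

# model_json_schema() returns a plain JSON-schema dict describing the model
schema = Joke.model_json_schema()
print(schema["title"])               # Joke
print(sorted(schema["properties"]))  # ['punchline', 'rating', 'setup']
print(schema["required"])            # ['setup', 'punchline'] (rating has a default)
```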

Function Calling

# Use a model that supports function calling
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")
structured_llm = llm.with_structured_output(Joke, method="function_calling")

result = structured_llm.invoke("Tell me a joke about programming")
print(f"Setup: {result.setup}")
print(f"Punchline: {result.punchline}")
print(f"Rating: {result.rating}")
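With `method="function_calling"`, the client advertises the schema to the model as an OpenAI-style tool definition and parses the tool call back into a `Joke` instance. A simplified, stdlib-only sketch of what such a tool payload looks like (illustrative only; the real client generates this from the Pydantic schema, and `rating` would actually be `anyOf` integer/null):

```python
# Simplified illustration of an OpenAI-style "tools" entry derived from the
# Joke model; the actual payload built by the client may differ in detail.
tool = {
    "type": "function",
    "function": {
        "name": "Joke",
        "parameters": {
            "type": "object",
            "properties": {
                "setup": {"type": "string"},
                "punchline": {"type": "string"},
                "rating": {"type": "integer"},  # simplified; optional in practice
            },
            "required": ["setup", "punchline"],
        },
    },
}
```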

JSON Mode

For models that support JSON mode, you can also use this method:

from langchain_together import ChatTogether
from pydantic import BaseModel, Field

class Response(BaseModel):
    message: str = Field(description="The main message")
    category: str = Field(description="Category of the response")

# Use a model that supports JSON mode
llm = ChatTogether(model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")
structured_llm = llm.with_structured_output(Response.model_json_schema(), method="json_mode")

result = structured_llm.invoke(
    "Respond with a JSON containing a message about cats and categorize it. "
    "Use the exact keys 'message' and 'category'."
)
# With a JSON-schema dict, json_mode also returns a plain dict
print(f"{result['category']}: {result['message']}")

Embeddings

from langchain_together import TogetherEmbeddings

embeddings = TogetherEmbeddings(model="BAAI/bge-base-en-v1.5")
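TogetherEmbeddings follows the standard LangChain Embeddings interface: `embed_query` for a single string and `embed_documents` for a batch, each returning vectors as lists of floats. A common next step is comparing vectors by cosine similarity; here is a minimal stdlib sketch, using toy vectors in place of real API output:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# With real vectors you would call, e.g.:
#   v1 = embeddings.embed_query("What is the capital of France?")
# Toy vectors for illustration:
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 1.0, 0.0]
print(round(cosine_similarity(v1, v2), 3))  # 0.5
```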
