
Building applications with LLMs through composability

Project description

🦜🍎️ LangChain Core

License: MIT

Quick Install

pip install langchain-core

What is it?

LangChain Core contains the base abstractions that power the rest of the LangChain ecosystem.

These abstractions are designed to be as modular and simple as possible. Examples of these abstractions include those for language models, document loaders, embedding models, vectorstores, retrievers, and more.

The benefit of having these abstractions is that any provider can implement the required interface and then easily be used in the rest of the LangChain ecosystem.
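For example, a hypothetical provider can implement the Embeddings interface by defining just two methods. The class below is a made-up sketch that returns placeholder vectors; a real integration would call the provider's embedding API instead.

from typing import List

from langchain_core.embeddings import Embeddings


class AcmeEmbeddings(Embeddings):
    """Hypothetical provider: returns deterministic placeholder vectors."""

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        return [self.embed_query(text) for text in texts]

    def embed_query(self, text: str) -> List[float]:
        # A real provider would call its embedding model here.
        vector = [0.0] * 8
        for i, ch in enumerate(text):
            vector[i % 8] += ord(ch) / 1000.0
        return vector

Because AcmeEmbeddings satisfies the interface, it can be passed anywhere the ecosystem expects an Embeddings object, for example when constructing a vector store.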

For full documentation see the API reference.

1️⃣ Core Interface: Runnables

The concept of a Runnable is central to LangChain Core – it is the interface that most LangChain Core components implement, giving them

  • a common invocation interface (invoke, batch, stream, etc.)
  • built-in utilities for retries, fallbacks, schemas and runtime configurability
  • easy deployment with LangServe

For more, check out the runnable docs. Examples of components that implement the interface include LLMs, Chat Models, Prompts, Retrievers, Tools, and Output Parsers.
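As a rough sketch of that shared interface (the names below are illustrative), wrapping a plain function in RunnableLambda yields an object with the same methods and helpers as any other component:

from langchain_core.runnables import RunnableLambda

greet = RunnableLambda(lambda name: f"Hello, {name}!")

print(greet.invoke("Ada"))           # single input -> single output
print(greet.batch(["Ada", "Alan"]))  # list of inputs, run concurrently
for chunk in greet.stream("Grace"):  # streamed output (a single chunk here,
    print(chunk)                     # since the wrapped function is not a generator)

# Async variants (ainvoke, abatch, astream) and utilities such as retries
# are available on every Runnable:
sturdy_greet = greet.with_retry(stop_after_attempt=2)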

You can use LangChain Core objects in two ways:

  1. imperatively, i.e. by calling them directly, e.g. model.invoke(...)

  2. declaratively, with LangChain Expression Language (LCEL)

  3. or with a mix of both, e.g. a step in your LCEL sequence can be a plain custom function (see the sketch after the table below)

Feature   | Imperative                      | Declarative
Syntax    | All of Python                   | LCEL
Tracing   | ✅ – Automatic                  | ✅ – Automatic
Parallel  | ✅ – with threads or coroutines | ✅ – Automatic
Streaming | ✅ – by yielding                | ✅ – Automatic
Async     | ✅ – by writing async functions | ✅ – Automatic
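Here is a small sketch of that mix (the helper functions are made up): piping a plain Python function into a sequence coerces it into a RunnableLambda, so imperative code slots directly into a declarative chain.

from langchain_core.runnables import RunnableLambda

def to_upper(text: str) -> str:
    return text.upper()

def exclaim(text: str) -> str:
    return text + "!"

# Imperative: just call the functions.
print(exclaim(to_upper("hello")))           # HELLO!

# Declarative: compose the same steps with LCEL's | operator.
chain = RunnableLambda(to_upper) | exclaim  # exclaim is auto-wrapped
print(chain.invoke("hello"))                # HELLO!
print(chain.batch(["foo", "bar"]))          # batching, tracing, streaming and async come for free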

⚡️ What is LangChain Expression Language?

LangChain Expression Language (LCEL) is a declarative language for composing LangChain Core runnables into sequences (or DAGs), covering the most common patterns when building with LLMs.

LangChain Core compiles LCEL sequences to an optimized execution plan, with automatic parallelization, streaming, tracing, and async support.

For more, check out the LCEL docs.
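As a minimal, self-contained sketch of an LCEL sequence and a small DAG (a RunnableLambda stands in for a real chat model, which would normally come from a provider package):

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda, RunnableParallel

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
# Stand-in for a chat model; swap in a provider's model in a real chain.
fake_model = RunnableLambda(lambda pv: "FAKE ANSWER to: " + pv.to_string())

# Sequence: prompt -> model -> output parser.
chain = prompt | fake_model | StrOutputParser()
print(chain.invoke({"topic": "bears"}))

# DAG: the branches of a RunnableParallel run concurrently on the same input.
fan_out = RunnableParallel(joke=chain, topic=RunnableLambda(lambda x: x["topic"]))
print(fan_out.invoke({"topic": "bears"}))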

[Diagram: hierarchical organization of the LangChain framework, showing the interconnected parts across multiple layers.]

For more advanced use cases, also check out LangGraph, which is a graph-based runner for cyclic and recursive LLM workflows.

📕 Releases & Versioning

langchain-core is currently on version 0.1.x.

As langchain-core contains the base abstractions and runtime for the whole LangChain ecosystem, we will communicate any breaking changes with advance notice and version bumps. The exception is anything in langchain_core.beta: given how quickly the field is changing, being able to move fast is still a priority, and this module is where we allow ourselves to do so.

Minor version increases will occur for:

  • Breaking changes for any public interfaces NOT in langchain_core.beta

Patch version increases will occur for:

  • Bug fixes
  • New features
  • Any changes to private interfaces
  • Any changes to langchain_core.beta

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see the Contributing Guide.

⛰️ Why build on top of LangChain Core?

The whole LangChain ecosystem is built on top of LangChain Core, so you're in good company when building on top of it. Some of the benefits:

  • Modularity: LangChain Core is designed around abstractions that are independent of each other, and not tied to any specific model provider.
  • Stability: We are committed to a stable versioning scheme, and will communicate any breaking changes with advance notice and version bumps.
  • Battle-tested: LangChain Core components have the largest install base in the LLM ecosystem, and are used in production by many companies.
  • Community: LangChain Core is developed in the open, and we welcome contributions from the community.

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

langchain_core-0.1.53.tar.gz (236.7 kB)

Uploaded Source

Built Distribution

langchain_core-0.1.53-py3-none-any.whl (303.1 kB)

Uploaded Python 3

File details

Details for the file langchain_core-0.1.53.tar.gz.

File metadata

  • Download URL: langchain_core-0.1.53.tar.gz
  • Upload date:
  • Size: 236.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for langchain_core-0.1.53.tar.gz
Algorithm   | Hash digest
SHA256      | df3773a553b5335eb645827b99a61a7018cea4b11dc45efa2613fde156441cec
MD5         | 9442901a29d35beed6109193e0040a5d
BLAKE2b-256 | e9653aaff91481b9d629a31630a40000d403bff24b3c62d9abc87dc998298cce

See more details on using hashes here.
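As a quick sketch, one way to check a downloaded archive against the SHA256 digest above is with Python's hashlib (the local filename is assumed to match the release file):

import hashlib

expected = "df3773a553b5335eb645827b99a61a7018cea4b11dc45efa2613fde156441cec"
with open("langchain_core-0.1.53.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")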

Provenance

The following attestation bundles were made for langchain_core-0.1.53.tar.gz:

Publisher: _release.yml on langchain-ai/langchain


File details

Details for the file langchain_core-0.1.53-py3-none-any.whl.

File metadata

File hashes

Hashes for langchain_core-0.1.53-py3-none-any.whl
Algorithm   | Hash digest
SHA256      | 02a88a21e3bd294441b5b741625fa4b53b1c684fd58ba6e5d9028e53cbe8542f
MD5         | 6da4a01ae57a4f0f2bee8666b9e71e76
BLAKE2b-256 | 6a10285fa149ce95300d91ea0bb124eec28889e5ebbcb59434d1fe2f31098d72

See more details on using hashes here.

Provenance

The following attestation bundles were made for langchain_core-0.1.53-py3-none-any.whl:

Publisher: _release.yml on langchain-ai/langchain

