Building applications with LLMs through composability
Project description
This project is a fork of langchain-core adapted for QPython.
What is it?
LangChain Core contains the base abstractions that power the rest of the LangChain ecosystem.
These abstractions are designed to be as modular and simple as possible. Examples of these abstractions include those for language models, document loaders, embedding models, vectorstores, retrievers, and more.
The benefit of having these abstractions is that any provider can implement the required interface and then easily be used in the rest of the LangChain ecosystem.
For full documentation see the API reference.
1️⃣ Core Interface: Runnables
The concept of a Runnable is central to LangChain Core – it is the interface that most LangChain Core components implement, giving them:
- a common invocation interface (invoke, batch, stream, etc.)
- built-in utilities for retries, fallbacks, schemas and runtime configurability
- easy deployment with LangServe
For more, check out the Runnable docs. Examples of components that implement the interface include LLMs, chat models, prompts, retrievers, tools, and output parsers.
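As a quick illustration of that common interface, here is a minimal sketch that wraps a plain Python function with RunnableLambda; the function is only an illustrative stand-in, not a real model:

```python
from langchain_core.runnables import RunnableLambda

# Wrap a plain Python function so it gains the common Runnable interface.
greet = RunnableLambda(lambda name: f"Hello, {name}!")

print(greet.invoke("world"))          # single input
print(greet.batch(["Ada", "Alan"]))   # list of inputs, run concurrently
for chunk in greet.stream("world"):   # stream() yields chunks (just one here,
    print(chunk)                      # since the function is not a generator)

# Built-in utilities are attached declaratively, e.g. retries:
greet_with_retry = greet.with_retry(stop_after_attempt=3)
```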
You can use LangChain Core objects in two ways:
- imperative, i.e. call them directly, e.g. model.invoke(...)
- declarative, with LangChain Expression Language (LCEL)
- or a mix of both! e.g. one of the steps in your LCEL sequence can be a custom function (see the sketch after the table below)
| Feature | Imperative | Declarative |
| --- | --- | --- |
| Syntax | All of Python | LCEL |
| Tracing | ✅ – Automatic | ✅ – Automatic |
| Parallel | ✅ – with threads or coroutines | ✅ – Automatic |
| Streaming | ✅ – by yielding | ✅ – Automatic |
| Async | ✅ – by writing async functions | ✅ – Automatic |
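To make the distinction concrete, here is a minimal sketch mixing both styles. The "model" below is a stand-in RunnableLambda so the example runs without any provider package; in real code it would be a chat model from a provider integration:

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")

# Stand-in "model": a plain function wrapped as a Runnable (illustrative only).
fake_model = RunnableLambda(lambda value: f"[model reply to: {value.to_string()}]")

def shout(text: str) -> str:
    # A custom Python step mixed into the declarative sequence.
    return text.upper()

# Declarative: compose Runnables into a sequence with LCEL's | operator.
chain = prompt | fake_model | RunnableLambda(shout)

# Imperative: call the same objects directly.
prompt_value = prompt.invoke({"topic": "bears"})
print(fake_model.invoke(prompt_value))

# The composed chain exposes the same interface (invoke/batch/stream/ainvoke).
print(chain.invoke({"topic": "bears"}))
```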
⚡️ What is LangChain Expression Language?
LangChain Expression Language (LCEL) is a declarative language for composing LangChain Core runnables into sequences (or DAGs), covering the most common patterns when building with LLMs.
LangChain Core compiles LCEL sequences to an optimized execution plan, with automatic parallelization, streaming, tracing, and async support.
For more, check out the LCEL docs.
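As a small sketch of what that buys you, the following composes two independent steps with RunnableParallel; the branch names and functions are arbitrary stand-ins for illustration:

```python
from langchain_core.runnables import RunnableLambda, RunnableParallel

# Two independent steps composed into a dict-shaped parallel block;
# the branches run concurrently when the block is invoked.
length = RunnableLambda(lambda text: len(text))
shouted = RunnableLambda(lambda text: text.upper())

mapper = RunnableParallel(length=length, shouted=shouted)

print(mapper.invoke("hello"))   # {'length': 5, 'shouted': 'HELLO'}

# Async and streaming come for free on the same object:
# await mapper.ainvoke("hello")
# for chunk in mapper.stream("hello"): ...
```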
For more advanced use cases, also check out LangGraph, which is a graph-based runner for cyclic and recursive LLM workflows.
File details
Details for the file langchain_core_aipy-0.3.47.tar.gz.
File metadata
- Download URL: langchain_core_aipy-0.3.47.tar.gz
- Upload date:
- Size: 336.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 478d88ac22b1a64416bf8b8d34ae9546c73ac39c14fce4eba4c757222f04ca01 |
| MD5 | dbfd622fc45c510f6c446673034520a0 |
| BLAKE2b-256 | 90d5ecdd1fd5602aaf00938eb5fc5877946bb4b2be87651fa8067c8be8a4d08f |
File details
Details for the file langchain_core_aipy-0.3.47-py3-none-any.whl.
File metadata
- Download URL: langchain_core_aipy-0.3.47-py3-none-any.whl
- Upload date:
- Size: 416.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 473538174a2527f5d6d9f2927a74d7d5236acb6c835c3648dc3587cd12ee8f60 |
| MD5 | c0842549f45d0b8d2c5433dfe7bcc706 |
| BLAKE2b-256 | 369889988018ac8f85ae8c4ef166ea21ba0a80e468bf3f5dac07f008e5f4c5e1 |