AI workflows in your PostgreSQL database
Project description
pgai simplifies the process of building search, Retrieval Augmented Generation (RAG), and other AI applications with PostgreSQL.
Overview
The goal of pgai is to make working with AI easier and more accessible to developers. Because data is the foundation of most AI applications, pgai makes it easier to leverage your data in AI workflows. In particular, pgai supports:
Working with embeddings generated from your data:
- Automatically create and sync vector embeddings for your data (learn more)
- Search your data using vector and semantic search (learn more)
- Implement Retrieval Augmented Generation inside a single SQL statement (learn more); a short SQL sketch follows this list
- Perform high-performance, cost-efficient ANN search on large vector workloads with pgvectorscale, which complements pgvector.
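To give a flavor of the search and RAG items above, here is a minimal sketch. It assumes a hypothetical `blog_embedding` table (or view) of text chunks and their embeddings maintained by pgai, uses pgvector's cosine-distance operator, and calls `ai.openai_embed` and `ai.openai_chat_complete`; exact function signatures can differ between pgai versions, so treat this as illustrative rather than the canonical API.

```sql
-- Hypothetical schema: blog_embedding(chunk text, embedding vector),
-- maintained by pgai from a source table blog(id, title, body).

-- Semantic search: embed the query once, then order by cosine distance.
WITH query AS (
    SELECT ai.openai_embed('text-embedding-3-small', 'How do I tune autovacuum?') AS q
)
SELECT chunk
FROM blog_embedding, query
ORDER BY embedding <=> q
LIMIT 5;

-- RAG in a single statement: gather the top chunks and pass them as
-- context to a chat-completion call.
WITH query AS (
    SELECT ai.openai_embed('text-embedding-3-small', 'How do I tune autovacuum?') AS q
),
context AS (
    SELECT string_agg(chunk, E'\n\n') AS ctx
    FROM (
        SELECT chunk
        FROM blog_embedding, query
        ORDER BY embedding <=> q
        LIMIT 5
    ) top_chunks
)
SELECT ai.openai_chat_complete(
           'gpt-4o',
           jsonb_build_array(
               jsonb_build_object('role', 'system',
                                  'content', 'Answer using only the provided context.'),
               jsonb_build_object('role', 'user',
                                  'content', ctx || E'\n\nQuestion: How do I tune autovacuum?')
           )
       )->'choices'->0->'message'->>'content' AS answer
FROM context;
```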
Leveraging LLMs for data processing tasks:
- Retrieve LLM chat completions from models like Claude 3.5 Sonnet, OpenAI GPT-4o, Cohere Command, and Llama 3 (via Ollama). (learn more)
- Reason over your data and facilitate use cases like classification, summarization, and data enrichment on your existing relational data in PostgreSQL (see an example). A brief SQL sketch follows this list.
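For example, a classification pass over existing rows can be a single query that calls a chat-completion function once per row. The `support_ticket` table and the model name below are hypothetical placeholders.

```sql
-- Hypothetical table: support_ticket(id bigint, body text).
-- Classify each ticket into a coarse category with a chat model.
SELECT id,
       ai.openai_chat_complete(
           'gpt-4o-mini',
           jsonb_build_array(
               jsonb_build_object('role', 'system',
                                  'content', 'Classify the ticket as bug, billing, or question. Reply with a single word.'),
               jsonb_build_object('role', 'user', 'content', body)
           )
       )->'choices'->0->'message'->>'content' AS category
FROM support_ticket
LIMIT 20;  -- keep a first pass small; each row is one API call
```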
Learn more about pgai: To learn more about the pgai extension and why we built it, read pgai: Giving PostgreSQL Developers AI Engineering Superpowers.
Contributing: We welcome contributions to pgai! See the Contributing page for more information.
Getting Started
Here's how to get started with pgai:
For a quick start, try out automatic data embedding using pgai Vectorizer:
- Try our cloud offering by creating a free trial account and heading over to our pgai Vectorizer documentation.
- Or check out our quick start guide to get up and running in less than 10 minutes with a self-hosted Postgres instance (a short SQL sketch follows).
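The heart of that quick start is a single call to `ai.create_vectorizer`. The sketch below assumes a hypothetical `blog` table with a `contents` column and OpenAI embeddings; the configuration helpers and their arguments may vary across pgai versions, so follow the Vectorizer documentation for the authoritative form.

```sql
-- Hypothetical source table.
CREATE TABLE blog (
    id       SERIAL PRIMARY KEY,
    title    TEXT,
    contents TEXT
);

-- One call asks pgai Vectorizer to create, populate, and keep in sync
-- an embedding store for blog.contents.
SELECT ai.create_vectorizer(
    'blog'::regclass,
    destination => 'blog_contents_embeddings',
    embedding   => ai.embedding_openai('text-embedding-3-small', 768),
    chunking    => ai.chunking_recursive_character_text_splitter('contents')
);
```

Once the vectorizer exists, embeddings are generated and kept up to date automatically as rows are inserted, updated, or deleted.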
For other use cases, first install pgai on Timescale Cloud, via a pre-built Docker image, or from source. Then, choose your own adventure:
- Automate AI embedding with pgai Vectorizer.
- Use pgai to integrate AI from your provider (a few sample calls follow this list). Some examples:
- Ollama - configure pgai for Ollama, then use the model to embed, chat complete and generate.
- OpenAI - configure pgai for OpenAI, then use the model to tokenize, embed, chat complete and moderate. This page also includes advanced examples.
- Anthropic - configure pgai for Anthropic, then use the model to generate content.
- Cohere - configure pgai for Cohere, then use the model to tokenize, embed, chat complete, classify, and rerank.
- Leverage LLMs for data processing tasks such as classification, summarization, and data enrichment (see the OpenAI example).
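To give a sense of the per-provider functions referenced above, the calls below sketch an Ollama embedding, an Ollama generation, and an Anthropic generation. Model names, prompts, and response JSON paths are illustrative assumptions; each provider page documents the exact function signatures and required configuration (API keys, Ollama host, and so on).

```sql
-- Ollama: embed a string with a locally served model.
SELECT ai.ollama_embed('nomic-embed-text', 'PostgreSQL is a relational database.');

-- Ollama: plain text generation.
SELECT ai.ollama_generate('llama3', 'Explain what a B-tree index is in one sentence.')
       ->>'response' AS explanation;

-- Anthropic: generate content from a message list.
SELECT ai.anthropic_generate(
           'claude-3-5-sonnet-20240620',
           jsonb_build_array(
               jsonb_build_object('role', 'user',
                                  'content', 'Name three PostgreSQL index types.')
           )
       )->'content'->0->>'text' AS answer;
```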
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
File details
Details for the file pgai-0.2.0.tar.gz.
File metadata
- Download URL: pgai-0.2.0.tar.gz
- Upload date:
- Size: 4.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | d3a7bdec3a094f56706055c8448a6b64414e1f81f16cff4b3d5f27b220fb78cc
MD5 | cc8588b44873e421ac3616ba1c55eb0f
BLAKE2b-256 | 320905dbb63d24c404332033adec7d3a7afcb06e5bc325c13695c87659b45330
Provenance
The following attestation bundles were made for pgai-0.2.0.tar.gz:

Publisher: release-please.yml on timescale/pgai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pgai-0.2.0.tar.gz
- Subject digest: d3a7bdec3a094f56706055c8448a6b64414e1f81f16cff4b3d5f27b220fb78cc
- Sigstore transparency entry: 151724100
- Sigstore integration time:
File details
Details for the file pgai-0.2.0-py3-none-any.whl.
File metadata
- Download URL: pgai-0.2.0-py3-none-any.whl
- Upload date:
- Size: 3.8 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | e7f5da0f33dfdd16b7e7db1be1e0ea6c97c627464d2013a80f45c3f951730866
MD5 | eb75437eeb55b8bca6249605500121f4
BLAKE2b-256 | b15efe71469d846bc95c2a408415ec469d98910896bfe6f6014d6f64a0ec6964
Provenance
The following attestation bundles were made for pgai-0.2.0-py3-none-any.whl:

Publisher: release-please.yml on timescale/pgai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pgai-0.2.0-py3-none-any.whl
- Subject digest: e7f5da0f33dfdd16b7e7db1be1e0ea6c97c627464d2013a80f45c3f951730866
- Sigstore transparency entry: 151724102
- Sigstore integration time: