
AI workflows in your PostgreSQL database

Project description

pgai

pgai allows you to develop RAG, semantic search, and other AI applications directly in PostgreSQL


pgai simplifies the process of building search, Retrieval Augmented Generation (RAG), and other AI applications with PostgreSQL.

Overview

The goal of pgai is to make working with AI easier and more accessible to developers. Because data is the foundation of most AI applications, pgai makes it easier to leverage your data in AI workflows. In particular, pgai supports:

Working with embeddings generated from your data:

  • Automatically create and sync vector embeddings for your data (learn more)
  • Search your data using vector and semantic search (learn more)
  • Implement Retrieval Augmented Generation inside a single SQL statement (learn more); a short SQL sketch follows this list.
  • Perform high-performance, cost-efficient ANN search on large vector workloads with pgvectorscale, which complements pgvector.
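
As a rough illustration of the semantic-search and single-statement RAG items above, here is a minimal sketch. It assumes a hypothetical blog table with a pgvector embedding column and an OpenAI API key already configured for pgai; function names, signatures, and response shapes may differ between pgai versions.

```sql
-- Hypothetical table: blog(id, title, content, embedding vector(1536)).

-- Semantic search: embed the query text with pgai, then order by pgvector distance.
SELECT title, content
FROM blog
ORDER BY embedding <=> ai.openai_embed('text-embedding-3-small', 'postgres performance tips')
LIMIT 5;

-- RAG in a single statement: retrieve the closest rows and pass them to a chat model as context.
SELECT ai.openai_chat_complete(
    'gpt-4o-mini',
    jsonb_build_array(
        jsonb_build_object('role', 'system', 'content', 'Answer using only the provided context.'),
        jsonb_build_object('role', 'user', 'content',
            (SELECT string_agg(t.content, E'\n')
             FROM (SELECT content
                   FROM blog
                   ORDER BY embedding <=> ai.openai_embed('text-embedding-3-small', 'postgres performance tips')
                   LIMIT 5) AS t)
            || E'\n\nQuestion: What are good postgres performance tips?')
    )
) -> 'choices' -> 0 -> 'message' ->> 'content' AS answer;
```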

Leverage LLMs for data processing tasks:

  • Retrieve LLM chat completions from models like Claude 3.5 Sonnet, OpenAI GPT-4o, Cohere Command, and Llama 3 (via Ollama). (learn more)
  • Reason over your data and facilitate use cases like classification, summarization, and data enrichment on your existing relational data in PostgreSQL (see an example).
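
To make the data-processing item above concrete, the statement below is a hedged sketch: it assumes a hypothetical support_ticket table and a locally reachable Ollama server, and the exact function signature and response shape may vary across pgai versions.

```sql
-- Hypothetical table: support_ticket(id, body, category).
-- Ask a local Ollama model to classify each ticket and store the label on the row.
UPDATE support_ticket
SET category = ai.ollama_generate(
        'llama3',
        'Classify this support ticket as billing, bug, or question. '
        || 'Reply with the single word only. Ticket: ' || body
    ) ->> 'response'
WHERE category IS NULL;
```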

Learn more about pgai: To learn more about the pgai extension and why we built it, read pgai: Giving PostgreSQL Developers AI Engineering Superpowers.

Contributing: We welcome contributions to pgai! See the Contributing page for more information.

Getting Started

Here's how to get started with pgai:

For a quick start, try out automatic data embedding using pgai Vectorizer:
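
The definition below is a minimal sketch of a vectorizer, assuming a hypothetical blog table and an OpenAI embedding model; the available options and argument names may differ between pgai releases.

```sql
-- Create a vectorizer that keeps embeddings for blog.content in sync automatically.
SELECT ai.create_vectorizer(
    'blog'::regclass,
    destination => 'blog_content_embeddings',
    embedding   => ai.embedding_openai('text-embedding-3-small', 1536),
    chunking    => ai.chunking_recursive_character_text_splitter('content')
);
```

Once defined, the vectorizer keeps the destination embeddings in sync as the source rows change, which is the automatic create-and-sync behavior described above.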

For other use cases, first install pgai on Timescale Cloud, from a pre-built Docker image, or from source. Then, choose your own adventure:

  • Automate AI embedding with pgai Vectorizer.
  • Use pgai to integrate AI from your provider. Some examples (short SQL sketches follow this list):
    • Ollama - configure pgai for Ollama, then use the model to embed, chat complete and generate.
    • OpenAI - configure pgai for OpenAI, then use the model to tokenize, embed, chat complete and moderate. This page also includes advanced examples.
    • Anthropic - configure pgai for Anthropic, then use the model to generate content.
    • Cohere - configure pgai for Cohere, then use the model to tokenize, embed, chat complete, classify, and rerank.
  • Leverage LLMs for data processing tasks such as classification, summarization, and data enrichment (see the OpenAI example).
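
The provider integrations referenced above might look roughly like the following sketches. They assume the relevant API keys (or a local Ollama server) are already configured for pgai; function names, signatures, and response shapes can vary by version.

```sql
-- OpenAI: tokenize and embed a short piece of text.
SELECT ai.openai_tokenize('text-embedding-3-small', 'PostgreSQL is a powerful database.');
SELECT ai.openai_embed('text-embedding-3-small', 'PostgreSQL is a powerful database.');

-- Ollama: chat completion against a locally running model.
SELECT ai.ollama_chat_complete(
    'llama3',
    jsonb_build_array(
        jsonb_build_object('role', 'user', 'content',
            'Name one advantage of storing embeddings next to your relational data.')
    )
) -> 'message' ->> 'content' AS reply;

-- Anthropic: generate content from a single user message.
SELECT ai.anthropic_generate(
    'claude-3-5-sonnet-20240620',
    jsonb_build_array(
        jsonb_build_object('role', 'user', 'content', 'Summarize what a vectorizer does in one sentence.')
    )
) -> 'content' -> 0 ->> 'text' AS summary;
```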



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pgai-0.2.1.tar.gz (4.1 MB, Source)

Built Distribution

pgai-0.2.1-py3-none-any.whl (3.8 MB, Python 3)

File details

Details for the file pgai-0.2.1.tar.gz.

File metadata

  • Download URL: pgai-0.2.1.tar.gz
  • Upload date:
  • Size: 4.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for pgai-0.2.1.tar.gz
  • SHA256: 463952ce86b1d3a87b23b651ff442ec116156664d93b374b351b65ac790f5011
  • MD5: bb9051e930acbf4dced74126a02b35d3
  • BLAKE2b-256: 4a360fdf88cc09dee711772f20e16a99aa4707050936efba4380aa1731dea30e

See more details on using hashes here.

Provenance

The following attestation bundles were made for pgai-0.2.1.tar.gz:

Publisher: release-please.yml on timescale/pgai

File details

Details for the file pgai-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: pgai-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 3.8 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for pgai-0.2.1-py3-none-any.whl
  • SHA256: 7c8e30362256fe71e6166ceb60ba11e4a6c300ff7e41aa8dfd8f4bb71b34cacf
  • MD5: 2c70e78d5083f2dd510493d9bf674360
  • BLAKE2b-256: 1079598eca7302e96feba222628f0881a225b47aa63565c2c89838cf779a0eaf

See more details on using hashes here.

Provenance

The following attestation bundles were made for pgai-0.2.1-py3-none-any.whl:

Publisher: release-please.yml on timescale/pgai
