
Building an index of GPT summaries.

Project description

🗂️ GPT Index

GPT Index is a project consisting of a set of data structures that are built with LLMs and can be traversed with LLMs in order to answer queries.

PyPI: https://pypi.org/project/gpt-index/.

Documentation: https://gpt-index.readthedocs.io/en/latest/.

🚀 Overview

NOTE: This README is not updated as frequently as the documentation. Please check out the documentation above for the latest updates!

Context

  • LLMs are a phenomenal piece of technology for knowledge generation and reasoning.
  • A big limitation of LLMs is context size (e.g. OpenAI's davinci model for GPT-3 has a limit of 4096 tokens. Large, but not infinite).
  • The ability to feed "knowledge" to LLMs is therefore restricted by this limited prompt size (beyond what is already encoded in the model weights).
  • Thought: what if LLMs could access a much larger database of knowledge without retraining/finetuning?
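A rough way to see the context-size constraint in the bullets above is to estimate token counts. The sketch below uses the common ~4-characters-per-token rule of thumb for English text rather than a real tokenizer (tiktoken, listed among the dependencies below, would give exact counts); the 4096 figure is davinci's limit quoted above, and the function names are illustrative.

```python
# Rough illustration of the context-window constraint.
# Assumes the common ~4-characters-per-token heuristic for English text;
# a real tokenizer such as tiktoken would give exact counts.

MAX_TOKENS = 4096  # davinci's context window

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, reserved_for_answer: int = 256) -> bool:
    """True if the prompt would still leave room for the model's completion."""
    return estimate_tokens(text) + reserved_for_answer <= MAX_TOKENS

corpus = "word " * 10_000          # ~50,000 characters of "knowledge"
print(estimate_tokens(corpus))     # far more than 4096
print(fits_in_context(corpus))     # False: it cannot all fit in one prompt
```

Anything larger than a few pages of text already blows the budget, which is exactly the gap the index structures below are meant to bridge.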

Proposed Solution

That's where the GPT Index comes in. GPT Index is a simple, flexible interface between your external data and LLMs. It resolves the following pain points:

  • Provides simple data structures to resolve prompt size limitations.
  • Offers data connectors to your external data sources.
  • Offers you a comprehensive toolset trading off cost and performance.
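The first bullet (resolving prompt size limitations) ultimately means splitting a large corpus into prompt-sized pieces. The toy greedy splitter below illustrates the idea only; the function name and character budget are invented here and this is not GPT Index's actual text splitter.

```python
# Toy greedy splitter illustrating the "resolve prompt size limitations"
# bullet. NOT GPT Index's internal splitter; the name and the character
# budget are made up for illustration.

def split_into_chunks(text: str, max_chars: int = 2000) -> list[str]:
    """Greedily pack whole words into chunks of at most max_chars characters."""
    chunks: list[str] = []
    current: list[str] = []
    length = 0
    for word in text.split():
        extra = len(word) + (1 if current else 0)  # +1 for the joining space
        if current and length + extra > max_chars:
            chunks.append(" ".join(current))
            current, length = [], 0
            extra = len(word)
        current.append(word)
        length += extra
    if current:
        chunks.append(" ".join(current))
    return chunks

chunks = split_into_chunks("lorem ipsum " * 1000, max_chars=500)
print(len(chunks), max(len(c) for c in chunks))  # every chunk fits the budget
```

Each resulting chunk is small enough to hand to the LLM one prompt at a time, which is the precondition for building any of the index structures described next.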

At the core of GPT Index is a data structure. Instead of relying on world knowledge encoded in the model weights, a GPT Index data structure does the following:

  • Uses a pre-trained LLM primarily for reasoning/summarization instead of prior knowledge.
  • Takes as input a large corpus of text data and builds a structured index over it (using an LLM or heuristics).
  • Allows users to query the index in order to synthesize an answer to a question - this requires both traversal of the index and synthesis of the answer.
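The build-then-traverse idea above can be sketched end to end with a mock "LLM": build a tree of summaries bottom-up, then answer a query by walking down from the root. The summarizer below just truncates text and the routing matches a keyword; both are stand-ins for real LLM calls, and none of this is GPT Index's actual implementation.

```python
# End-to-end toy version of the build/traverse idea. The "LLM" is faked:
# summarization truncates text, and traversal routes on a keyword match.
# Purely illustrative, not GPT Index's real implementation.

def mock_summarize(texts: list[str]) -> str:
    """Stand-in for an LLM summary call: concatenate and truncate."""
    return " ".join(texts)[:60]

def build_tree(chunks: list[str], fanout: int = 2) -> list[list[str]]:
    """Bottom-up build: each level summarizes groups of `fanout` nodes."""
    levels = [chunks]
    while len(levels[-1]) > 1:
        below = levels[-1]
        levels.append([mock_summarize(below[i:i + fanout])
                       for i in range(0, len(below), fanout)])
    return levels  # levels[0] = leaf chunks, levels[-1] = root summary

def query(levels: list[list[str]], keyword: str, fanout: int = 2) -> str:
    """Top-down traversal: descend into the child mentioning the keyword."""
    idx = 0
    for depth in range(len(levels) - 2, -1, -1):
        children = range(idx * fanout,
                         min((idx + 1) * fanout, len(levels[depth])))
        idx = next((c for c in children if keyword in levels[depth][c]),
                   idx * fanout)  # fall back to the first child
    return levels[0][idx]

chunks = ["apples grow on trees", "bananas are yellow",
          "cats chase mice", "dogs bark loudly"]
levels = build_tree(chunks)
print(query(levels, "dogs"))  # routed down the tree to the relevant leaf
```

The expensive part in the real system is that both the bottom-up summaries and the top-down routing decisions are LLM calls, which is why the toolset trades off cost against performance.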

💡 Contributing

Interested in contributing? See our Contribution Guide for more details.

📄 Documentation

Full documentation can be found here: https://gpt-index.readthedocs.io/en/latest/.

Please check it out for the most up-to-date tutorials, how-to guides, references, and other resources!

💻 Example Usage

pip install gpt-index

Examples are in the examples folder. Indices are in the indices folder (see list of indices below).

To build a tree index do the following:

from gpt_index import GPTTreeIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader('data').load_data()
index = GPTTreeIndex(documents)

To save to disk and load from disk, do:

# save to disk
index.save_to_disk('index.json')
# load from disk
index = GPTTreeIndex.load_from_disk('index.json')

To query:

index.query("<question_text>?", child_branch_factor=1)
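Here `child_branch_factor` controls how many child nodes are explored at each level of the tree: 1 follows a single path, while higher values query more children per parent, trading cost for recall. The toy traversal below illustrates that idea only; the word-overlap scoring is a stand-in for GPT Index's LLM-driven child selection, and the tree layout is invented for this example.

```python
# Toy illustration of child_branch_factor: how many children to explore at
# each level. Scoring is a fake word-overlap count, not an LLM call, and
# the tree structure is made up for this example.

def score(text: str, question: str) -> int:
    """Fake relevance score: number of words shared with the question."""
    return len(set(text.lower().split()) & set(question.lower().split()))

def traverse(node: dict, question: str, child_branch_factor: int = 1) -> list[str]:
    """Collect leaf texts, exploring the top-k children at each node."""
    if "children" not in node:
        return [node["text"]]
    ranked = sorted(node["children"],
                    key=lambda c: score(c["text"], question), reverse=True)
    leaves: list[str] = []
    for child in ranked[:child_branch_factor]:
        leaves.extend(traverse(child, question, child_branch_factor))
    return leaves

tree = {"text": "root summary", "children": [
    {"text": "fruit color facts", "children": [
        {"text": "apples are red"}, {"text": "bananas are yellow"}]},
    {"text": "animal sound facts", "children": [
        {"text": "cats purr"}, {"text": "dogs bark"}]},
]}
print(traverse(tree, "what color are bananas", child_branch_factor=1))
```

With `child_branch_factor=2`, both subtrees would be explored and more leaves retrieved, at roughly double the number of LLM calls.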

🔧 Dependencies

The main third-party package requirements are tiktoken, openai, and langchain.

All requirements should be contained within the setup.py file. To run the package locally without building the wheel, simply do pip install -r requirements.txt.

Project details



Download files


Source Distribution

gpt_index-0.1.18.tar.gz (76.2 kB)

Uploaded Source

File details

Details for the file gpt_index-0.1.18.tar.gz.

File metadata

  • Download URL: gpt_index-0.1.18.tar.gz
  • Upload date:
  • Size: 76.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for gpt_index-0.1.18.tar.gz
  • SHA256: bf8900a0b1bec51d182a4c08cf359b6250b003b5de9c296757544cb7dc416394
  • MD5: 99f6e84d0b1511c2cc70e8e6af20cdda
  • BLAKE2b-256: 9b805138b9bdb4b5d3eacccd8420d2fd02a8aed174c39303b6d6e1d80c6cdc43

