phidata

AI Toolkit for Engineers


🧰 Phidata is a batteries-included toolkit for building products using LLMs


It solves the problem of building LLM applications by providing:

💻 Software layer

  • Components for building LLM apps: RAG, Agents, Workflows
  • Components for extending LLM apps: VectorDbs, Storage, Memory, Cache
  • Components for monitoring LLM apps: Model Inputs/Outputs, Quality, Cost
  • Components for improving LLM apps: Fine-tuning, RLHF
  • Components for securing LLM apps: I/O Validation, Guardrails

📱 Application layer

  • Tools for serving LLM apps: FastApi, Django, Streamlit
  • Tools for serving LLM components: PgVector, Postgres, Redis

🌉 Infrastructure layer

  • Infrastructure for running LLM apps locally: Docker
  • Infrastructure for running LLM apps in production: AWS
  • Best practices like testing, formatting, CI/CD, security and secret management.

Our goal is to integrate these three layers of software development in a single toolkit and build production-grade LLM Apps.

🚀 How it works

  • Create your LLM app from a template using phi ws create
  • Run your app locally using phi ws up dev:docker
  • Run your app on AWS using phi ws up prd:aws


๐Ÿ‘ฉโ€๐Ÿ’ป Quickstart: Build a LLM App ๐Ÿง‘โ€๐Ÿ’ป

Let's build an LLM App with GPT-4, using PgVector for the knowledge base and storage. We'll serve the app using Streamlit and FastApi, running locally on Docker.

Install Docker Desktop before moving ahead.

Setup

Open the Terminal and create a Python virtual environment:

python3 -m venv ~/.venvs/llmenv
source ~/.venvs/llmenv/bin/activate

Install phidata

pip install phidata

Create your codebase

Create your codebase using the llm-app template that is pre-configured with FastApi, Streamlit and PgVector. Use this codebase as a starting point for your LLM product.

phi ws create -t llm-app -n llm-app

This will create a folder named llm-app with the following structure:

llm-app
├── api               # directory for FastApi routes
├── app               # directory for Streamlit apps
├── db                # directory for database components
├── llm               # directory for LLM components
│   ├── conversations       # LLM conversations
│   ├── knowledge_base.py   # LLM knowledge base
│   └── storage.py          # LLM storage
├── notebooks         # directory for Jupyter notebooks
├── Dockerfile        # Dockerfile for the application
├── pyproject.toml    # python project definition
├── requirements.txt  # python dependencies generated by pyproject.toml
├── scripts           # directory for helper scripts
├── utils             # directory for shared utilities
└── workspace
    ├── dev_resources.py  # Dev resources running locally
    ├── prd_resources.py  # Production resources running on AWS
    ├── jupyter           # Jupyter notebook resources
    ├── secrets           # directory for storing secrets
    └── settings.py       # Phidata workspace settings

Set OpenAI Key

Set the OPENAI_API_KEY environment variable. You can get one from OpenAI.

export OPENAI_API_KEY=sk-***
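Your app reads this key from the environment at runtime. A minimal sketch of how that lookup might be done, failing fast when the key is missing (`get_openai_api_key` is an illustrative helper, not part of phidata):

```python
import os

def get_openai_api_key() -> str:
    # Read the key set via `export OPENAI_API_KEY=sk-...`
    # and fail with an actionable message if it is absent.
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key.startswith("sk-"):
        raise RuntimeError(
            "OPENAI_API_KEY is not set; run: export OPENAI_API_KEY=sk-..."
        )
    return key
```

Checking at startup gives a clear error immediately, rather than an opaque authentication failure on the first LLM call.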

Serve your LLM App using Streamlit

Streamlit allows us to build micro front-ends for our LLM App and is extremely useful for building basic applications in pure Python. Start the app group using:

phi ws up --group app

Press Enter to confirm and allow a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.

Open localhost:8501 to view the Streamlit apps, which you can customize and make your own.

Chat with PDFs

  • Click on Chat with PDFs in the sidebar
  • Enter a username and wait for the knowledge base to load.
  • Choose between RAG or Autonomous mode.
  • Ask "How do I make chicken tikka salad?"
  • The streamlit apps are defined in the app folder.
  • The Conversations powering these apps are defined in the llm/conversations folder.
  • The Streamlit application is defined in the workspace/dev_resources.py file.

Serve your LLM App using FastApi

Streamlit is great for building micro front-ends, but most production applications use a front-end framework like Next.js backed by a REST API built with a framework like FastApi.

Your LLM App comes pre-configured with FastApi, start the api group using:

phi ws up --group api

Press Enter to confirm and allow a few minutes for the image to download.

View API Endpoints

  • Open localhost:8000/docs to view the API Endpoints.
  • Test the v1/pdf/conversation/chat endpoint with {"message": "how do I make chicken tikka salad"}
  • Check out the api/routes/pdf_routes.py file for endpoints that you can integrate with your front-end or product.
  • The FastApi application is defined in the workspace/dev_resources.py file.
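As a sketch, the chat endpoint above can be exercised from Python using only the standard library. The URL and payload come from the steps above; `build_chat_request` is an illustrative helper, and the exact JSON shape your generated codebase expects is worth confirming in api/routes/pdf_routes.py:

```python
import json
import urllib.request

def build_chat_request(message: str) -> urllib.request.Request:
    # Build (but do not send) a POST request to the local
    # FastApi endpoint from the docs page above.
    body = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        url="http://localhost:8000/v1/pdf/conversation/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

request = build_chat_request("how do I make chicken tikka salad")
# Once the api group is running, send it with:
#   with urllib.request.urlopen(request) as response:
#       print(response.read().decode())
```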

Delete local resources

Play around, then stop the workspace using:

phi ws down

Run your LLM App on AWS

Read how to run your LLM App on AWS in the phidata documentation.


Download files

Download the file for your platform.

Source Distribution

phidata-2.0.20.tar.gz (293.0 kB)

Built Distribution

phidata-2.0.20-py3-none-any.whl (433.0 kB)

File details

Details for the file phidata-2.0.20.tar.gz.

File metadata

  • Download URL: phidata-2.0.20.tar.gz
  • Upload date:
  • Size: 293.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for phidata-2.0.20.tar.gz:

  • SHA256: 215f0c18c4ee58fcfb11b2b53ece811eaaedc27f1fdc5991453429b3f3c4b028
  • MD5: 093a492faa01dd820136f74390e040a9
  • BLAKE2b-256: 40e99d837d670088a48c38a6201197548ea358d408cea545046a65da5322dd95

Use these hashes to verify the integrity of a downloaded file.
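For instance, the SHA256 digest listed above can be checked locally with Python's standard library (`sha256_of` is an illustrative helper; the path is wherever you saved the download):

```python
import hashlib

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large archives need not fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example: compare against the published digest for the sdist.
# expected = "215f0c18c4ee58fcfb11b2b53ece811eaaedc27f1fdc5991453429b3f3c4b028"
# assert sha256_of("phidata-2.0.20.tar.gz") == expected
```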

File details

Details for the file phidata-2.0.20-py3-none-any.whl.

File metadata

  • Download URL: phidata-2.0.20-py3-none-any.whl
  • Upload date:
  • Size: 433.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for phidata-2.0.20-py3-none-any.whl:

  • SHA256: 5a2b62ecae7228b45f0fbbec9d315b305a9b438ddbb3e10b01b714a4007f1e28
  • MD5: 516a4f4867d7e699f5d73155264c4766
  • BLAKE2b-256: 2415ff2068f7311ca04e3141f0b623443b6bddd5b0b038cbc5b1f48e67fdc5b0

