phidata
Build, ship and monitor AI products
✨ What is phidata?
Phidata is an OSS toolkit for building AI products.
It gives you production-ready AI Apps with 1 command.
Its goal is to provide a paved path for building AI products for anyone with basic python skills.
🎖 Use it to build
- AI Apps (RAG, autonomous or multimodal applications)
- AI Assistants (automate data engineering, python or snowflake tasks)
- REST APIs (with FastApi, PostgreSQL)
- Web Apps (with Django, PostgreSQL)
- Data Platforms (with Airflow, Superset, Jupyter)
💡 What you get
Production-ready codebases built with:
- Building blocks like conversations, agents, knowledge bases defined as pydantic objects
- Applications like FastApi, Streamlit, Django, Postgres defined as pydantic objects
- Infrastructure components (docker, AWS) also defined as pydantic objects
Phidata applications run locally using docker and can be deployed to AWS with 1 command.
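To make the "defined as pydantic objects" idea concrete, here is a minimal, purely illustrative sketch of the pattern; the class and field names below are hypothetical, not phidata's actual API.

```python
# Illustrative sketch only: the class and field names here are hypothetical,
# chosen to show the "components as pydantic objects" pattern described above.
from typing import Optional

from pydantic import BaseModel


class Conversation(BaseModel):
    """A building block declared with typed, validated fields."""
    llm_model: str = "gpt-4"
    knowledge_base: Optional[str] = None  # e.g. a PgVector-backed collection
    storage_table: str = "conversations"


class StreamlitApp(BaseModel):
    """An application component, also declared as a pydantic object."""
    name: str = "app"
    port: int = 8501
    enabled: bool = True


# Components validate on construction and serialize cleanly, which is what
# lets the same definitions drive local docker runs and AWS deployments.
conversation = Conversation(knowledge_base="pdf_documents")
app = StreamlitApp(port=8501)
print(conversation, app)
```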
👩💻 How it works
- Create your codebase using a template:
phi ws create
- Run your app locally:
phi ws up dev:docker
- Run your app on AWS:
phi ws up prd:aws
📚 More Information:
- Read the documentation
- Chat with us on Discord
- Email us at help@phidata.com
🚀 Quickstart: Build a RAG LLM App
Let's build a RAG LLM App with GPT-4. We'll use:
- Streamlit for the front-end
- FastApi for the back-end
- PgVector for Knowledge Base and Storage
- Read the full tutorial here.
Install Docker Desktop to run this app locally.
Create a virtual environment
Open the Terminal and create an ai directory with a python virtual environment.
mkdir ai && cd ai
python3 -m venv aienv
source aienv/bin/activate
Install
Install phidata
pip install -U phidata
Create your codebase
Create your codebase using the llm-app template, pre-configured with FastApi, Streamlit and PgVector.
phi ws create -t llm-app -n llm-app
This will create a folder llm-app with a pre-built LLM App that you can customize and make your own.
Serve your LLM App using Streamlit
Streamlit allows us to build micro front-ends for our LLM App and is extremely useful for building basic applications in pure python. Start the app group using:
phi ws up --group app
Press Enter to confirm and give a few minutes for the image to download.
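If you are curious what a Streamlit micro front-end involves, here is a generic sketch of a chat-style page; this is not the llm-app template's actual code, just the pattern in plain Streamlit.

```python
# Generic Streamlit sketch of a chat-style micro front-end.
# This is not the llm-app template's code; it only illustrates the pattern.
import streamlit as st

st.title("Chat with PDFs")

# Sidebar controls similar to the ones described in the steps below.
username = st.sidebar.text_input("Username")
conversation_type = st.sidebar.selectbox("Conversation type", ["RAG", "Autonomous"])

uploaded_pdf = st.file_uploader("Upload a PDF", type="pdf")

question = st.text_input("Ask a question", value="How do I make chicken curry?")
if st.button("Send") and question:
    # In the real app this call would go to the LLM and knowledge base;
    # here we simply echo the inputs to show the round trip.
    st.write(f"[{conversation_type}] {username or 'anonymous'} asked: {question}")
```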
Chat with PDFs
- Open localhost:8501 to view streamlit apps that you can customize and make your own.
- Click on Chat with PDFs in the sidebar.
- Enter a username and wait for the knowledge base to load.
- Choose the RAG Conversation type.
- Ask "How do I make chicken curry?"
- Upload PDFs and ask questions.
Serve your LLM App using FastApi
Streamlit is great for building micro front-ends, but any production application will be built using a front-end framework like next.js, backed by a REST API built using a framework like FastApi.
Your LLM App comes ready-to-use with FastApi endpoints. Start the api group using:
phi ws up --group api
Press Enter to confirm and give a few minutes for the image to download.
View API Endpoints
- Open localhost:8000/docs to view the API Endpoints.
- Load the knowledge base using /v1/pdf/conversation/load-knowledge-base
- Test the /v1/pdf/conversation/chat endpoint with {"message": "How do I make chicken curry?"}
- The LLM API comes pre-built with endpoints that you can integrate with your front-end.
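As a quick smoke test, you can also exercise these endpoints from Python. The paths and the chat payload come from the docs above; using POST for both calls and printing the raw JSON response are assumptions, so check localhost:8000/docs for the exact methods and schema.

```python
# Exercise the local FastApi endpoints from Python.
# Endpoint paths and the chat payload come from the API docs above;
# using POST for both calls is an assumption; verify at localhost:8000/docs.
import requests

BASE_URL = "http://localhost:8000"

# Load the knowledge base first (this can take a while on the first run).
resp = requests.post(f"{BASE_URL}/v1/pdf/conversation/load-knowledge-base")
resp.raise_for_status()

# Ask a question against the chat endpoint.
resp = requests.post(
    f"{BASE_URL}/v1/pdf/conversation/chat",
    json={"message": "How do I make chicken curry?"},
)
resp.raise_for_status()
print(resp.json())
```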
Optional: Run Jupyterlab
A jupyter notebook is a must-have for AI development and your llm-app comes with a notebook pre-installed with the required dependencies. Enable it by updating the workspace/settings.py file:
...
ws_settings = WorkspaceSettings(
    ...
    # Uncomment the following line
    dev_jupyter_enabled=True,
    ...
)
Start jupyter using:
phi ws up --group jupyter
Press Enter to confirm and give a few minutes for the image to download (only the first time). Verify container status and view logs on the docker dashboard.
View Jupyterlab UI
- Open localhost:8888 to view the Jupyterlab UI. Password: admin
- Play around with cookbooks in the notebooks folder.
Delete local resources
Play around and stop the workspace using:
phi ws down
Run your LLM App on AWS
Read how to run your LLM App on AWS.
📚 More Information:
- Read the documentation
- Chat with us on Discord
- Email us at help@phidata.com