TableGPT Agent
Introduction
tablegpt-agent is a pre-built agent for TableGPT2 (available on Hugging Face), a series of LLMs for table-based question answering. The agent is built on top of the LangGraph library and provides a user-friendly interface for interacting with the TableGPT model.
Installation
To install tablegpt-agent, use the following command:
pip install tablegpt-agent
This package depends on pybox, a Python code sandbox delegator. By default, pybox operates in an in-cluster mode. If you wish to run tablegpt-agent in a local environment, install the optional dependency:
pip install pybox[local]
Quick Start
Before using tablegpt-agent, ensure that you have an OpenAI-compatible server set up to host TableGPT2. We recommend using vllm for this:
python -m vllm.entrypoints.openai.api_server --served-model-name TableGPT2-7B --model path/to/weights
Note: For production environments, it’s important to optimize the vllm server configuration. For details, refer to the vllm documentation on server configuration.
Once the server is set up, you can use the following code to interact with the TableGPT model:
from datetime import date
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from tablegpt.agent import create_tablegpt_graph
from pybox import LocalPyBoxManager
llm = ChatOpenAI(openai_api_base=YOUR_VLLM_URL, openai_api_key="whatever", model_name="TableGPT2-7B")
# Use local pybox manager for development and testing
pybox_manager = LocalPyBoxManager()
agent = create_tablegpt_graph(
    llm=llm,
    pybox_manager=pybox_manager,
)
message = HumanMessage(content="Hi")
input = {
    "messages": [message],
    "parent_id": "some-parent-id",
    "date": date.today(),
}
async for event in agent.astream_events(
    input=input,
    version="v2",
):
    print(event)
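Note that astream_events must be awaited inside an event loop. If you run this snippet as a standalone script (rather than in a notebook or an already-async application), you can wrap the loop in a coroutine — a minimal sketch:

import asyncio

async def main():
    # Stream the agent's intermediate events and print them as they arrive.
    async for event in agent.astream_events(
        input=input,
        version="v2",
    ):
        print(event)

asyncio.run(main())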
Components
Data Analysis workflow
The Data Analysis workflow is the core functionality of the tablegpt-agent. It processes user input and generates appropriate responses. This workflow is similar to those found in most single-agent systems and consists of an agent and various tools. Specifically, the data analysis workflow includes:
- An Agent Powered by TableGPT2: This agent performs data analysis tasks.
- An IPython tool: This tool executes the generated code within a sandbox environment.
Additionally, the data analysis workflow offers several optional plugins that extend the agent's functionality (a sketch of wiring them in follows this list):
- VLM: A Visual Language Model that can be used to enhance summarization for data visualization tasks.
- Dataset Retriever: A retriever that fetches information about the dataset, improving the quality and relevance of the generated code.
- Safety Guard: A safety mechanism that protects the system from toxic inputs.
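As a rough illustration of how these plugins attach to the agent, the sketch below passes them as keyword arguments when building the graph. The keyword names vlm and safety_llm are assumptions made for this example and may not match the actual create_tablegpt_graph signature, and YOUR_VLM_URL / YOUR_GUARD_URL are placeholders in the same style as YOUR_VLLM_URL above; consult the project documentation for the exact interface.

# Hypothetical sketch: the keyword names vlm and safety_llm are assumptions,
# not the confirmed create_tablegpt_graph signature.
vlm = ChatOpenAI(openai_api_base=YOUR_VLM_URL, openai_api_key="whatever", model_name="YOUR_VLM_MODEL")
safety_llm = ChatOpenAI(openai_api_base=YOUR_GUARD_URL, openai_api_key="whatever", model_name="YOUR_GUARD_MODEL")

agent = create_tablegpt_graph(
    llm=llm,
    pybox_manager=pybox_manager,
    vlm=vlm,                # richer summaries for data-visualization results
    safety_llm=safety_llm,  # screens toxic or unsafe user inputs
)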
File Reading workflow
We separate the file reading workflow from the data analysis workflow to maintain greater control over how the LLM inspects the dataset files. Typically, if you let the LLM inspect the dataset itself, it uses the df.head() function to preview the data. While this is sufficient for basic cases, we have implemented a more structured approach by hard-coding the file reading workflow into several steps:
- normalization (optional): For some Excel files, the content may not be 'pandas-friendly'. We include an optional normalization step to transform the Excel content into a more suitable format for pandas.
- df.info(): Unlike df.head(), df.info() provides insights into the dataset's structure, such as the data types of each column and the number of non-null values, which also indicates whether a column contains NaN. This insight helps the LLM understand the structure and quality of the data.
- df.head(): The final step displays the first n rows of the dataset, where n is configurable. A larger value for n allows the LLM to glean more information from the dataset; however, too much detail may divert its attention from the primary task.
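To make the difference between the two inspection steps concrete, here is a small standalone pandas example (independent of tablegpt-agent):

import pandas as pd

df = pd.DataFrame({
    "product": ["apple", "banana", None, "cherry"],
    "price": [1.2, 0.5, 2.3, None],
})

# df.info(): dtypes and non-null counts per column, revealing missing values.
df.info()

# df.head(n): the first n rows; a larger n shows more data but can add noise.
print(df.head(2))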
Code Execution
The tablegpt-agent directs the TableGPT model to generate Python code for data analysis. This code is then executed within a sandbox environment to ensure system security. The execution is managed by the pybox library, which provides a simple way to run Python code outside the main process.
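For illustration only, here is a rough sketch of what running generated code through pybox might look like. The start and run method names are assumptions made for this sketch, not the confirmed pybox API; refer to the pybox documentation for the real interface.

from pybox import LocalPyBoxManager

pybox_manager = LocalPyBoxManager()

# Hypothetical usage: the method names below are assumptions, not confirmed pybox API.
box = pybox_manager.start()            # assumed: acquire an isolated execution box
result = box.run(code="print(1 + 1)")  # assumed: execute code inside the sandbox
print(result)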
Plugins
tablegpt-agent provides several plugin interfaces for extending its functionality. These plugins are designed to be easily integrated into the agent and can be used to add new features or modify existing ones. The following plugins are available:
VLM
Dataset Retriever
Safety Guard
Dataset Normalizer
License
Model Card
For more information about TableGPT2, see the TableGPT2 Model Card.
Citation
If you find our work helpful, please cite us by
@misc{
}