A High-efficiency Open-source Toolkit for Table-to-Latex Transformation
Project description
StructEqTable-Deploy: A High-efficiency Open-source Toolkit for Table-to-Latex Transformation
[Related Paper] [Website] [Dataset (Google Drive)] [Dataset (Hugging Face)]
Welcome to the official repository of StructEqTable-Deploy, a solution that converts table images into LaTeX, powered by scalable data from the DocGenome benchmark.
Abstract
Tables are an effective way to represent structured data in scientific publications, financial statements, invoices, web pages, and many other scenarios. Extracting tabular data from a visual table image and performing downstream reasoning tasks on the extracted data is challenging, mainly because tables often present complicated column and row headers with spanning-cell operations. To address these challenges, we present TableX, a large-scale multi-modal table benchmark extracted from the DocGenome benchmark for table pre-training, comprising more than 2 million high-quality image-LaTeX pairs covering 156 disciplinary classes. Benefiting from such large-scale data, we further train an end-to-end model, StructEqTable, which precisely produces the corresponding LaTeX description from a visual table image and performs multiple table-related reasoning tasks, including structural extraction and question answering, broadening its application scope and potential.
TODO
- Release inference code and checkpoints of StructEqTable.
- Support Chinese version of StructEqTable.
- Improve the inference speed of StructEqTable.
Installation
conda create -n structeqtable python=3.9
conda activate structeqtable
pip install "git+https://github.com/UniModal4Reasoning/StructEqTable-Deploy.git"
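To verify the installation, a quick import check can be run. This is only a sketch under the assumption that the importable module is named struct_eqtable, matching the PyPI distribution name.
# Sanity check after installation.
# Assumption: the top-level module is `struct_eqtable` (same as the distribution name).
import struct_eqtable
print("struct_eqtable imported from:", struct_eqtable.__file__)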
Quick Demo
- Run demo/demo.py (see the programmatic sketch after this list):
cd demo
python demo.py \
    --image_path ./demo.png \
    --ckpt_path ${CKPT_PATH}
- Visualization Results
- The input data are sampled from the SciHub domain.
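For programmatic use, the sketch below mirrors what demo/demo.py does from the command line. The build_model entry point, its ckpt_path argument, and the call signature are assumptions inferred from the CLI flags above, not a documented API; check demo/demo.py for the actual interface.
# A minimal sketch of calling StructEqTable from Python, mirroring demo/demo.py.
# Assumptions (hypothetical, not a documented API): a `build_model` entry point
# that takes a checkpoint path, and a model callable that maps a PIL image to a
# LaTeX string. See demo/demo.py for the real interface.
from PIL import Image
from struct_eqtable import build_model  # hypothetical entry point

model = build_model(ckpt_path="path/to/checkpoint")  # corresponds to --ckpt_path
image = Image.open("demo/demo.png").convert("RGB")   # corresponds to --image_path
latex_table = model(image)                           # LaTeX description of the table
print(latex_table)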
Acknowledgements
- DocGenome. An Open Large-scale Scientific Document Benchmark for Training and Testing Multi-modal Large Models.
- ChartVLM. A Versatile Benchmark and Foundation Model for Complicated Chart Reasoning.
- Pix2Struct. Screenshot Parsing as Pretraining for Visual Language Understanding.
- UniMERNet. A Universal Network for Real-World Mathematical Expression Recognition.
- Donut. UniMERNet's Transformer encoder-decoder is referenced from Donut.
- Nougat. The tokenizer is adopted from Nougat.
License
Citation
If you find our models / code / papers useful in your research, please consider giving a ⭐ and a citation 📝. Thanks!
@article{xia2024docgenome,
title={DocGenome: An Open Large-scale Scientific Document Benchmark for Training and Testing Multi-modal Large Language Models},
author={Xia, Renqiu and Mao, Song and Yan, Xiangchao and Zhou, Hongbin and Zhang, Bo and Peng, Haoyang and Pi, Jiahao and Fu, Daocheng and Wu, Wenjie and Ye, Hancheng and others},
journal={arXiv preprint arXiv:2406.11633},
year={2024}
}
Contact Us
If you encounter any issues or have questions, please feel free to contact us via zhouhongbin@pjlab.org.cn.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file struct_eqtable-0.1.0.tar.gz
File metadata
- Download URL: struct_eqtable-0.1.0.tar.gz
- Upload date:
- Size: 8.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | ec8fe53096ba47b0e3f61f8692897ddb5bb3ec1fc9f67870485dad7dc37c61b3
MD5 | c778c36ba113a13e264ed1daf53d4486
BLAKE2b-256 | 893cf26b111e83d855a820f1238e3e638f84914423551d8f75b9731e72596646
File details
Details for the file struct_eqtable-0.1.0-py3-none-any.whl
File metadata
- Download URL: struct_eqtable-0.1.0-py3-none-any.whl
- Upload date:
- Size: 8.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5da1bdaa90651fafd9072fd65b4c799887c316cfc6189502acf326b1d574332a
MD5 | 6911af5438be32cde38fd1f404666416
BLAKE2b-256 | a425d1e91b2ad2727c9ecb332607729a03c2f0f345afd2547f4100e543330f0e