Official implementation for HYDRA.

HYDRA: A Hyper Agent for Dynamic Compositional Visual Reasoning

This is the code for the paper HYDRA: A Hyper Agent for Dynamic Compositional Visual Reasoning, accepted at ECCV 2024 [Project Page].

Release

  • [2024/08/05] 🚀 PyPI package released.
  • [2024/07/29] 🔥 HYDRA is open-sourced on GitHub.

TODOs

We realize that gpt-3.5-turbo-0613 has been deprecated and that gpt-3.5 is being replaced by gpt-4o-mini, so we will release an updated version of HYDRA.

As of July 2024, gpt-4o-mini should be used in place of gpt-3.5-turbo, as it is cheaper, more capable, multimodal, and just as fast (see the OpenAI API page).

We also note that OpenAI has updated its embedding models, as described in this link. Because of the uncertainty around these embedding model updates, we suggest training a new version of the RL controller yourself and updating the RL models.

  • GPT-4o-mini replacement.
  • LLaMA3.1 (ollama) replacement.
  • Gradio Demo
  • GPT-4o Version.
  • HYDRA with RL

Installation

Requirements

  • Python >= 3.10
  • conda

Please follow the instructions below to install the required packages and set up the environment.

1. Clone this repository.

git clone https://github.com/ControlNet/HYDRA

2. Setup conda environment and install dependencies.

From source:

bash -i build_env.sh

If you encounter errors, consider going through the build_env.sh file and installing the packages manually.

From PyPI:

# run after you have set up the conda environment with PyTorch and CUDA
pip install "hydra_vl4ai"
AM_I_DOCKER=False BUILD_WITH_CUDA=True CUDA_HOME=$CONDA_PREFIX pip install --no-build-isolation "git+https://github.com/ControlNet/HYDRA.git#subdirectory=module_repos/GLIP"
AM_I_DOCKER=False BUILD_WITH_CUDA=True CUDA_HOME=$CONDA_PREFIX pip install --no-build-isolation "git+https://github.com/ControlNet/HYDRA.git#subdirectory=module_repos/Grounded-Segment-Anything/GroundingDINO"
AM_I_DOCKER=False BUILD_WITH_CUDA=True CUDA_HOME=$CONDA_PREFIX pip install --no-build-isolation "git+https://github.com/ControlNet/HYDRA.git#subdirectory=module_repos/Grounded-Segment-Anything/segment_anything"
AM_I_DOCKER=False BUILD_WITH_CUDA=True CUDA_HOME=$CONDA_PREFIX pip install --no-build-isolation "git+https://github.com/ControlNet/HYDRA.git#subdirectory=module_repos/LLaVA"

3. Configure the environments

Edit the .env file, or set the variables in your shell, to configure the environment:

OPENAI_API_KEY=your-api-key
OLLAMA_HOST=http://ollama.server:11434
# do not change this TORCH_HOME variable
TORCH_HOME=./pretrained_models
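
Equivalently, the same variables can be exported in the current shell session before launching HYDRA (the values below are placeholders):

```shell
# Set the environment variables for this shell session.
# Replace the placeholder values with your own credentials and host.
export OPENAI_API_KEY=your-api-key
export OLLAMA_HOST=http://ollama.server:11434
# Keep TORCH_HOME pointing at ./pretrained_models as required above.
export TORCH_HOME=./pretrained_models
```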

4. Download the pretrained models

Run the script below to download the pretrained models to the ./pretrained_models directory.

python -m hydra_vl4ai.download_models --base_config <EXP-CONFIG-PATH> --model_config <MODEL-CONFIG-PATH>

For example,

python -m hydra_vl4ai.download_models --base_config ./config/okvqa.yaml --model_config ./configs/model_config_1gpu.yaml

Inference

A separate worker process is required to run inference.

python -m hydra_vl4ai.executor --base_config <EXP-CONFIG-PATH> --model_config <MODEL-CONFIG-PATH>

Inference with a given image and prompt

python demo_cli.py \
  --image <IMAGE_PATH> \
  --prompt <PROMPT> \
  --base_config <EXP-CONFIG-PATH> \
  --model_config <MODEL-CONFIG-PATH>

Inference with Gradio GUI

TODO.

Inference on a dataset

python main.py \
  --data_root <YOUR-DATA-ROOT> \
  --base_config <EXP-CONFIG-PATH> \
  --model_config <MODEL-CONFIG-PATH>

The inference results are then saved in the ./result directory for evaluation.

Evaluation

python evaluate.py <RESULT_JSON_PATH> <DATASET_NAME>

For example,

python evaluate.py result/result_okvqa.jsonl okvqa
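
The .jsonl extension suggests the result file is in JSON Lines format (one JSON object per line). As a minimal sketch of how such a file can be inspected before evaluation — the field names below are hypothetical illustrations, not HYDRA's actual output schema:

```python
import json

def load_results(path):
    """Read a JSON Lines result file into a list of dicts."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

# Demonstrate with a tiny mock file; "question_id" and "answer"
# are assumed field names for illustration only.
with open("mock_result.jsonl", "w") as f:
    f.write('{"question_id": 1, "answer": "umbrella"}\n')
    f.write('{"question_id": 2, "answer": "surfboard"}\n')

results = load_results("mock_result.jsonl")
print(len(results))  # 2
```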

Citation

@inproceedings{ke2024hydra,
  title={HYDRA: A Hyper Agent for Dynamic Compositional Visual Reasoning},
  author={Fucai Ke and Zhixi Cai and Simindokht Jahangard and Weiqing Wang and Pari Delir Haghighi and Hamid Rezatofighi},
  booktitle={European Conference on Computer Vision},
  year={2024},
  organization={Springer}
}

Acknowledgements

Some code and prompts are based on cvlab-columbia/viper.
