
A flexible multimodal AI library for advanced contextual understanding and deployment.


CapibaraENT CLI

Capibara SSBD Model

CapibaraENT is a command-line tool for training, evaluating, and deploying Capibara-based language models, with built-in TPU support and hyperparameter optimization.

Features

  • Training and evaluation of Capibara models
  • Built-in TPU support
  • Hyperparameter optimization
  • Model deployment
  • Performance measurement
  • Docker container execution (optional)
  • Integration with Weights & Biases for experiment tracking
  • New layers and sub-models: Support for the latest modeling layers and advanced sub-models.

Requirements

  • Python 3.7+
  • JAX (for TPU optimization)
  • TensorFlow
  • Weights & Biases
  • Docker (optional, for container execution)

Installation

  1. Clone this repository:

    git clone https://github.com/anachroni-io/capibaraent-cli.git
    cd capibaraent-cli
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Set up Weights & Biases:

    wandb login
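
As an alternative to installing from source, the package is also published on PyPI as capibara_ent, so the released version can presumably be installed directly with pip:

    pip install capibara_ent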
    

Usage

The CapibaraENT CLI offers various options for working with Capibara models:

python capibaraent_cli.py [options]

Available options

  • --log-level: Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
  • --train: Train the model
  • --evaluate: Evaluate the model
  • --optimize: Perform hyperparameter optimization
  • --use-docker: Run the model inside Docker (optional; currently commented out in the code)
  • --deploy: Deploy the model
  • --measure-performance: Measure the model's performance
  • --model: Path to the model YAML file (for deserialization)
  • --new-layer: (optional) Activate new modeling layers
  • --sub-model: (optional) Specify sub-models to use
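
These flags can be combined in a single invocation. As an illustration (the YAML path below is hypothetical), a debug-level training run might look like:

    python capibaraent_cli.py --train --model configs/model.yaml --log-level DEBUG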

Usage Examples

  1. Train a model:

    python capibaraent_cli.py --train
    
  2. Evaluate a model:

    python capibaraent_cli.py --evaluate
    
  3. Perform hyperparameter optimization:

    python optimize_hyperparameters.py
    
  4. Deploy a model:

    python capibaraent_cli.py --deploy
    
  5. Measure model performance:

    python capibaraent_cli.py --measure-performance
    
  6. Run a model in Docker (optional, if Docker is set up):

    python capibaraent_cli.py --use-docker
    

Configuration

Model configuration is handled through environment variables and YAML files. Key configuration parameters include:

  • CAPIBARA_LEARNING_RATE
  • CAPIBARA_BATCH_SIZE
  • CAPIBARA_MAX_LENGTH
  • CAPIBARA_USE_TPU
  • WANDB_PROJECT
  • WANDB_ENTITY
  • CAPIBARA_NEW_LAYER (enables the new modeling layers)
  • CAPIBARA_SUB_MODEL (selects the sub-model to use)

Example .env file

CAPIBARA_LEARNING_RATE=0.001
CAPIBARA_BATCH_SIZE=32
CAPIBARA_MAX_LENGTH=512
CAPIBARA_USE_TPU=True
WANDB_PROJECT=my_project
WANDB_ENTITY=my_entity
CAPIBARA_NEW_LAYER=True
CAPIBARA_SUB_MODEL=my_sub_model

For a full list of configuration options, refer to the .env.example file.
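
The snippet below is a minimal sketch of how these variables might be read in Python using only the standard library; it is illustrative only and is not the project's actual configuration loader.

    import os

    # Read the Capibara settings documented above from the environment,
    # falling back to the defaults shown in the example .env file.
    learning_rate = float(os.environ.get("CAPIBARA_LEARNING_RATE", "0.001"))
    batch_size = int(os.environ.get("CAPIBARA_BATCH_SIZE", "32"))
    max_length = int(os.environ.get("CAPIBARA_MAX_LENGTH", "512"))
    use_tpu = os.environ.get("CAPIBARA_USE_TPU", "True").lower() == "true"
    wandb_project = os.environ.get("WANDB_PROJECT", "my_project")

    print(learning_rate, batch_size, max_length, use_tpu, wandb_project)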

Hyperparameter Optimization

To perform hyperparameter optimization:

  1. Ensure your Weights & Biases project is set up.

  2. Run the optimization script:

    python optimize_hyperparameters.py
    
  3. View the results in your Weights & Biases dashboard.
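
For orientation, the sketch below shows how a Weights & Biases sweep is typically driven from Python. It illustrates the general wandb sweep API only; it is not the contents of optimize_hyperparameters.py, and the parameter names and ranges are hypothetical.

    import wandb

    # Hypothetical sweep over hyperparameters mentioned elsewhere in this README.
    sweep_config = {
        "method": "bayes",
        "metric": {"name": "val_loss", "goal": "minimize"},
        "parameters": {
            "learning_rate": {"min": 1e-4, "max": 1e-2},
            "batch_size": {"values": [16, 32, 64]},
        },
    }

    def train():
        # Placeholder objective: a real run would train a Capibara model
        # with the values in run.config and log its validation loss.
        with wandb.init() as run:
            run.log({"val_loss": run.config.learning_rate * run.config.batch_size})

    sweep_id = wandb.sweep(sweep_config, project="my_project")
    wandb.agent(sweep_id, function=train, count=10)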

Development

To contribute to the project:

  1. Fork the repository
  2. Create a new branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Marco Durán - marco@anachroni.co

Project Link: https://github.com/anachroni-io/capibaraent-cli


