Uniform interface to deep learning approaches via Docker containers.
gobbli is a library that provides a uniform interface to various deep learning models for text via programmatically created Docker containers.
See the docs for prerequisites, a quickstart, and the API reference. In brief, you need Python 3.7 and Docker installed, with your user account granted the permissions needed to run Docker commands. Then run the following:
pip install gobbli
You may also want to check out the benchmarks to see some comparisons of gobbli's implementation of various models in different situations.
gobbli provides streamlit apps to perform some interactive tasks in a web browser, such as data exploration and model evaluation. Once you've installed the library, you can run the bundled apps using the gobbli command line application. Check the docs for more information.
Assuming you have the prerequisites noted above, install the package along with all required and optional dependencies in development mode:
pip install -e ".[augment,tokenize,interactive]"
Install additional dev dependencies:
pip install -r requirements.txt
Run linting, autoformatting, and tests:
If you're running tests in an environment with less than 12GB of memory, you'll want to pass the --low-resource argument when running tests to avoid out-of-memory errors.
NOTE: If running on a Mac, even with adequate memory available, you may encounter Out of Memory errors (exit status 137) when running the tests. This is due to not enough memory being allocated to your Docker daemon. Try going to Docker for Mac -> Preferences -> Advanced and raising "Memory" to 12GiB or more.
If you want to run the tests with GPU(s) enabled, see the --nvidia-visible-devices arguments under py.test --help. If your local machine doesn't have an NVIDIA GPU, but you have SSH access to a machine that does, you can use the test_remote_gpu.sh script to run the tests with GPU enabled over SSH.
To generate the docs, install the docs requirements:
pip install -r docs/requirements.txt
Since doc structure is auto-generated from the library, you must have the library (and all its dependencies) installed as well.
Then, run the following from the repository root:
Then browse the generated documentation in the build output directory.
gobbli wouldn't exist without the public release of several state-of-the-art models. The library incorporates:
- BERT, released by Google
- MT-DNN, released by Microsoft
- Universal Sentence Encoder, released by Google
- fastText, released by Facebook
- transformers, released by Hugging Face
- spaCy, by Explosion
Original work on the library was funded by RTI International.
Logo design by Marcia Underwood.
| Filename | Size | File type | Python version |
|---|---|---|---|
| gobbli-0.2.3-py3-none-any.whl | 256.1 kB | Wheel | py3 |
| gobbli-0.2.3.tar.gz | 200.4 kB | Source | None |