An example Python package
Image-Recognition-App
This app is an example of pure MLOps problem solving. Its purpose is to take an input image and predict the objects in it with a MobileNet model. Originally it consisted of three separate components:
- A standalone Python package with the OpenCV model.
- A server-side API developed with FastAPI that consumes the model package, deployed on Vercel.
- A dummy client side developed with HTML, CSS and JavaScript, deployed on GitHub Pages.
However, Vercel limits app size to 250 MB, and the model package, which uses OpenCV and other ML libraries, exceeded that limit, so another solution had to be found. The model package was turned into a second API, developed with Gradio and hosted on Hugging Face's free tier, and the first API now consumes this second API. Moreover, the image-sending logic had to be changed, requiring several conversions between lists, JSON strings, and byte objects.
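The conversions described above can be sketched with the standard library alone; the helper names and payload shape here are hypothetical illustrations, not the app's actual code:

```python
import base64
import json

# Hypothetical helpers illustrating the byte/JSON conversions described
# above (the real app's payload shape may differ).

def encode_image_for_json(image_bytes: bytes) -> str:
    """Raw image bytes -> base64 text -> JSON string, safe to POST."""
    return json.dumps({"image": base64.b64encode(image_bytes).decode("ascii")})

def decode_image_from_json(payload: str) -> bytes:
    """JSON string -> base64 text -> raw image bytes, on the receiving end."""
    return base64.b64decode(json.loads(payload)["image"])

fake_png = b"\x89PNG\r\n\x1a\n" + bytes(range(16))  # stand-in for real image data
assert decode_image_from_json(encode_image_for_json(fake_png)) == fake_png
```

Base64 wrapping is needed because raw bytes are not valid JSON; it is one straightforward way to move images through JSON-based APIs.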
The resulting architecture is as follows:
- A dummy client side developed with HTML, CSS and JavaScript, deployed on GitHub Pages, that calls API_1.
- API_1, developed with FastAPI and hosted on Vercel, which calls API_2.
- API_2, which hosts the model, developed with Gradio and hosted on Hugging Face.
- A Python package with the model, now unused, having been replaced by API_2 because of the size limitation.
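The API_1 → API_2 hop can be sketched as follows; the endpoint URL and helper name are hypothetical placeholders, and the request is only built here, not sent:

```python
import json
import urllib.request

# Hypothetical Gradio endpoint on Hugging Face (placeholder URL).
API_2_URL = "https://example-space.hf.space/run/predict"

def build_forward_request(payload: dict, url: str = API_2_URL) -> urllib.request.Request:
    """Build the POST request API_1 would forward to API_2 (not sent here)."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_forward_request({"data": ["<base64-encoded image>"]})
# Inside API_1's FastAPI endpoint, urllib.request.urlopen(req) would perform
# the actual call, and the prediction would be relayed back to the client.
```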
API_1 has automated CI/CD with GitHub Actions. The build includes installing dependencies, testing with Pytest, formatting with Black, and linting with Flake8. Deployment targets both GitHub Pages and Vercel.
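A build job along those lines might look like the following sketch; the file names, action versions, and step order are assumptions, not the project's actual workflow:

```yaml
# Hypothetical GitHub Actions workflow sketch (assumed names/versions).
name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt   # install dependencies
      - run: black --check .                   # formatting
      - run: flake8 .                          # linting
      - run: pytest                            # tests
```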
The model Python package also has automated CI/CD with GitHub Actions. The build includes installing dependencies, testing with Pytest, formatting with Black, linting with Flake8, and package building with setuptools. Deployment publishes to PyPI with Twine.
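Since the file metadata below shows an upload via Twine with Trusted Publishing enabled, the publish job may resemble this sketch; the job name and action versions are assumptions, not the project's real pipeline:

```yaml
# Hypothetical publish job sketch (the real pipeline.yml may differ).
  publish:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      id-token: write                             # required for PyPI Trusted Publishing
    steps:
      - uses: actions/checkout@v4
      - run: pip install build && python -m build   # setuptools builds sdist + wheel
      - uses: pypa/gh-action-pypi-publish@release/v1  # uploads dist/ via Twine
```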
FULL TECHNOLOGY STACK
[API]
- FastAPI
- Uvicorn
- Gradio
[Machine Learning]
- NumPy
- Matplotlib
- Pillow
- OpenCV
[Services]
- GitHub Actions
- Vercel
- Hugging Face
- PyPI
[Testing, formatting and building]
- Pytest
- Flake8
- Black
- Pre-commit
- setuptools
- Twine
File details
Details for the file gamr_backend_api_service-1.0.9-py3-none-any.whl.
File metadata
- Download URL: gamr_backend_api_service-1.0.9-py3-none-any.whl
- Upload date:
- Size: 14.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 28b1eb2c94c115b36676da141e82e2f7586a2b40e19af741904d710ef74d21ec |
| MD5 | e5306a86ddf34d639613d9c858791226 |
| BLAKE2b-256 | 8fb02ea5739082001e58f8b96c4301e4af9d8031186aa637ead73d41e6346346 |
Provenance
The following attestation bundles were made for gamr_backend_api_service-1.0.9-py3-none-any.whl:
Publisher:
- Workflow: pipeline.yml on gastonamengual/GAMR-Backend-Service-Vercel
- Permalink: gastonamengual/GAMR-Backend-Service-Vercel@d24327521697eabe1d930fbc0dd7ea7f693182c6
- Branch / Tag: refs/heads/main
- Owner: https://github.com/gastonamengual
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pipeline.yml@d24327521697eabe1d930fbc0dd7ea7f693182c6
- Trigger Event: push

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: gamr_backend_api_service-1.0.9-py3-none-any.whl
- Subject digest: 28b1eb2c94c115b36676da141e82e2f7586a2b40e19af741904d710ef74d21ec
- Sigstore transparency entry: 177402702
- Sigstore integration time: