
This project has been completed.


How to set up Airflow

Set the Airflow home directory

export AIRFLOW_HOME="/home/avnish/census_consumer_project/census_consumer_complaint/airflow"

To install Airflow

pip install apache-airflow

To initialize the Airflow metadata database

airflow db init

To create a login user for Airflow

airflow users create  -e avnish@ineuron.ai -f Avnish -l Yadav -p admin -r Admin  -u admin

To start the scheduler

airflow scheduler

To launch the Airflow webserver

airflow webserver -p <port_number>
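
For example, to run the webserver on port 8080

airflow webserver -p 8080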

To install additional dependencies

pip install pandas-tfrecords

python-snappy is pinned to 0.5.1 and built against a locally installed Snappy library

pip install \
  --upgrade --ignore-installed \
  python-snappy==0.5.1 \
  --global-option=build_ext \
  --global-option="-I/usr/local/include" \
  --global-option="-L/usr/local/lib"

To build and publish the package

pip install twine

python setup.py sdist bdist_wheel

To upload to TestPyPI

twine upload --repository-url https://test.pypi.org/legacy/ dist/*

To upload to PyPI

twine upload dist/*
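
Once uploaded, the TestPyPI release can be verified by installing it from the test index. A minimal sketch, assuming the distribution name census-consumer-complaint used by this project (dependencies may still need to be resolved from the main index):

pip install --index-url https://test.pypi.org/simple/ census-consumer-complaint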

To deploy your model

pip install tensorflow-serving-api

To inspect the model

saved_model_cli show --dir <dir_path>

The above command returns the available tag-sets.

saved_model_cli show --dir <dir_path> --tag_set <tag_name>

The above command shows the available model signatures (SignatureDef keys).

Next, with the tag-set and SignatureDef key, we can inspect the model's inputs and outputs

saved_model_cli show --dir <dir_path> --tag_set <tag_name> --signature_def <SignatureDef Key>
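
For example, assuming the model was exported with the default serving tag and signature names

saved_model_cli show --dir <dir_path> --tag_set serve --signature_def serving_default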

To inspect all signatures without specifying a tag-set or SignatureDef

saved_model_cli show --dir <dir_path> --all

Testing the model

Test the model's predictions using saved_model_cli run with sample input data

--input_examples: input data formatted as a tf.Example data structure

Other parameters

--outdir: write the outputs to files in the given directory (by default the output is printed to the terminal)

--overwrite: overwrite the output files if they already exist

--tf_debug: run the model with the TensorFlow Debugger (tfdbg)
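
A minimal example invocation, assuming the default serve tag and serving_default signature, an input tensor named examples, and an illustrative feature name consumer_complaint_narrative

saved_model_cli run --dir <dir_path> \
  --tag_set serve \
  --signature_def serving_default \
  --input_examples 'examples=[{"consumer_complaint_narrative": ["I was charged twice for the same transaction."]}]'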

To expose your model as an API using the tensorflow/serving Docker image

docker pull tensorflow/serving

Single model configuration

sudo docker run -p 8500:8500 \
  -p 8501:8501 \
  -v <model_dir>:<target_dir> \
  -e MODEL_NAME=<model_name> \
  -e MODEL_BASE_PATH=<target_dir> \
  -t tensorflow/serving:latest

For example

sudo docker run -p 8500:8500 -p 8501:8501 \
  -v /home/avnish/census_consumer_project/census_consumer_complaint/census_consumer_complaint_data/saved_models:/avnish/my_model \
  -e MODEL_NAME=my_model \
  -e MODEL_BASE_PATH=/avnish \
  -t tensorflow/serving:latest
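
Once the container is running, port 8501 serves the REST API and port 8500 serves gRPC. A sketch of a REST prediction request, assuming the model is named my_model as above; the feature names in the payload are purely illustrative and must match the model's serving signature

curl -X POST http://localhost:8501/v1/models/my_model:predict \
  -d '{"instances": [{"consumer_complaint_narrative": "I was charged twice for the same transaction."}]}'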

To inspect the Docker container's directory structure

docker exec -it <container_name> bash
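
For example, to confirm that the model was mounted where TensorFlow Serving expects it in the container started above

docker exec <container_name> ls /avnish/my_model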
