
nlp2go - hosting NLP models for demo purposes

Examples

Hosting a single model

nlp2go --model model_path --predictor biotag
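
Once the server is up, you can send it requests over HTTP. Below is a minimal sketch using curl; the port, path, and payload field name are assumptions for illustration and may differ depending on your configuration and predictor:

# Hypothetical request: adjust host, port, path, and payload to match your setup.
curl -X POST http://localhost:5000/ \
     -H "Content-Type: application/json" \
     -d '{"input": "John lives in New York."}'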

Hosting multiple models

  1. Create a JSON file as shown below:
{
    "API1_PATH": {
      "model": "model1_path",
      "predictor": "predictor_tag"
    },
    "API2_PATH": {
      "model": "model2_path",
      "predictor": "predictor_tag"
    }
}
  2. Run:
nlp2go --json json_file_path
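
Each top-level key in the JSON file appears to serve as the API path for its model, so API1_PATH and API2_PATH become separate endpoints. A sketch of querying them with curl (the port and payload format are assumptions, not guaranteed by nlp2go):

# Hypothetical requests: one endpoint per key defined in the JSON file.
curl -X POST http://localhost:5000/API1_PATH -d '{"input": "some text"}'
curl -X POST http://localhost:5000/API2_PATH -d '{"input": "other text"}'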

Installation

Installing via pip

pip install nlp2go

Running nlp2go

Once you've installed nlp2go, you can run it with

python -m nlp2go.server # local version
or
nlp2go # pip installed version

and the following parameters:

$ nlp2go
arguments:
  --model       model path
  or
  --json        JSON file with model settings

  --outdir      output directory for processed results

optional arguments:
  -h, --help    show this help message and exit
  --predictor   format results for different kinds of tasks    ['biotag', 'tag', 'default']
  --path        API path
  --port        API hosting port
  --cli         command-line mode
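
For example, a single-model server with an explicit API path and port could be started as follows (the model path and values here are placeholders; only the documented flags are used):

nlp2go --model ./my_model --predictor tag --path /api/nlp --port 5000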

JSON file example

{
    "API1_PATH": {
      "model": "model1_path",
      "predictor": "predictor_tag"
    },
    "API2_PATH": {
      "model": "model2_path",
      "predictor": "predictor_tag"
    }
}

Exposing the application over the web

I recommend using ngrok to expose this API for demo purposes.
Ngrok: https://ngrok.com
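
For example, if the server listens on port 5000 (match this to your --port value), a temporary public URL can be created with:

ngrok http 5000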
