taichu-serve is a tool for serving deep learning inference.
## Quick Start
### Requirements

- Python 3.6+
- Docker
### 1. Installation
```bash
pip3 install taichu-serve -i https://pypi.tuna.tsinghua.edu.cn/simple
```
### 2. Initialize a project
```bash
taichu_serve init
```

This creates a folder named `project` in the current directory.
### 3. Start the model service
```bash
cd project
taichu_serve run
```
### 4. Send prediction requests
```bash
# Test the HTTP endpoint
python3 test/http_client.py

# Test the gRPC endpoint
python3 test/grpc_client.py

# Test the streaming gRPC endpoint
python3 test/stream_grpc_client.py
```
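The generated `test/http_client.py` is the authoritative example; as a rough illustration, a minimal HTTP client could look like the sketch below. The port (`8080`), route (`/predict`), and payload shape are all assumptions made for illustration, not values taken from taichu-serve — check the generated client for the real ones.

```python
import json
import urllib.request

# Assumed host, port, and route; see the generated test/http_client.py
# for the values your service actually uses.
SERVICE_URL = "http://127.0.0.1:8080/predict"


def build_request(inputs):
    """Wrap raw inputs into a JSON payload (payload shape is an assumption)."""
    return {"inputs": inputs}


def predict(inputs):
    """POST the payload to the model service and return the parsed reply."""
    body = json.dumps(build_request(inputs)).encode("utf-8")
    req = urllib.request.Request(
        SERVICE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Requires a running service started with `taichu_serve run`.
    print(predict({"text": "hello"}))
```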
### 5. Build a deployable image
```bash
taichu_serve build --from_image {your-base-image}

# For example:
taichu_serve build --from_image python:3.7
```

On success, an image deployable on the platform is built locally; its name is printed after `Successfully built` in the command output.
### 6. Upload the image to the platform

After local testing, push the image to the platform's SWR registry:

```bash
docker tag {image-name} {platform-registry-address}/{image-name}
docker push {platform-registry-address}/{image-name}
```
## Project layout

Running `taichu_serve init` creates a folder named `project` in the current directory with the following structure:

```
project
├── test
│   ├── grpc_client.py          # gRPC test client
│   ├── http_client.py          # HTTP test client
│   └── stream_grpc_client.py   # streaming gRPC test client
├── customize_service.py        # custom model-service logic
├── models                      # model files
├── config.ini                  # project configuration
├── requirements.txt            # pip dependencies
├── dependencies.txt            # apt dependencies
├── launch.sh                   # launch script; do not modify
└── Dockerfile                  # project Dockerfile
```
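Your own logic lives in `customize_service.py`. taichu-serve defines the actual base class and hook names it expects; the skeleton below only illustrates the common preprocess → inference → postprocess pattern used by similar serving frameworks, with every class and method name assumed for illustration rather than taken from the taichu-serve API.

```python
# Illustrative skeleton only: the real base class and hook names come from
# taichu-serve itself; everything here is an assumed stand-in.
class ExampleModelService:
    def __init__(self, model_path):
        # Load model weights from the models/ directory here.
        self.model_path = model_path

    def preprocess(self, data):
        # Turn the raw request payload into model inputs.
        return data.get("inputs", data)

    def inference(self, inputs):
        # Run the actual model; the input is echoed back as a placeholder.
        return {"outputs": inputs}

    def postprocess(self, outputs):
        # Shape model outputs into the response payload.
        return outputs

    def handle(self, data):
        # Full request path: preprocess -> inference -> postprocess.
        return self.postprocess(self.inference(self.preprocess(data)))
```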