Project description
FlagEval Serving
A serving framework for AI models being evaluated on the FlagEval platform.
Installation
pip install --upgrade flageval-serving
Usage
- Model: of course we have a model ready to be evaluated; let's assume it lives at the path /path/to/model.
- Service code: then we write our service code; let's put it in service.py (or ./tests/service.py) and take an NLP model as the example (a fuller sketch follows these steps):

      from flageval.serving.service import NLPModelService, NLPEvalRequest, NLPEvalResponse, NLPCompletion


      class DemoService(NLPModelService):
          def global_init(self, model_path: str):
              print("Initial model with path", model_path)

          def infer(self, req: NLPEvalRequest) -> NLPEvalResponse:
              return NLPEvalResponse(
                  completions=[
                      NLPCompletion(
                          text='Hello, world!',
                          tokens='Hello, world!',
                      ),
                  ]
              )
- Serve: finally, we use the flageval-serving command to serve:

      flageval-serving --service service:DemoService dev /path/to/model  # start a development server
      flageval-serving --service service:DemoService run /path/to/model  # start a production server
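The DemoService above only echoes a fixed string. As a rough illustration of how a real model could be wired in, the sketch below loads a Hugging Face causal language model in global_init and generates a completion in infer. The Hugging Face checkpoint layout and the req.prompt field name are assumptions made for illustration, not part of the documented flageval-serving API.

      # Sketch only: assumes model_path is a Hugging Face checkpoint and that
      # NLPEvalRequest carries the prompt text as `req.prompt` (field name assumed).
      from transformers import AutoModelForCausalLM, AutoTokenizer

      from flageval.serving.service import (
          NLPModelService,
          NLPEvalRequest,
          NLPEvalResponse,
          NLPCompletion,
      )


      class HFDemoService(NLPModelService):
          def global_init(self, model_path: str):
              # Called once at startup with the model path given on the command line.
              self.tokenizer = AutoTokenizer.from_pretrained(model_path)
              self.model = AutoModelForCausalLM.from_pretrained(model_path)

          def infer(self, req: NLPEvalRequest) -> NLPEvalResponse:
              # Called for each evaluation request sent by the platform.
              inputs = self.tokenizer(req.prompt, return_tensors="pt")  # `prompt` is an assumption
              output_ids = self.model.generate(**inputs, max_new_tokens=64)
              text = self.tokenizer.decode(output_ids[0], skip_special_tokens=True)
              return NLPEvalResponse(
                  completions=[NLPCompletion(text=text, tokens=text)],
              )

Serving it works the same way as above, e.g. flageval-serving --service service:HFDemoService run /path/to/model.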
Dockerfile
A Dockerfile is used to build the image for the FlagEval evaluation platform.
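A minimal sketch of such an image, assuming the service code sits in service.py next to the Dockerfile and the model weights are copied to /app/model; the base image, paths, and layout here are assumptions, and only the pip install and serving commands come from the sections above:

      FROM python:3.9-slim

      WORKDIR /app

      # Install the serving framework (see Installation above).
      RUN pip install --upgrade flageval-serving

      # Copy the service code and the (assumed) model directory into the image.
      COPY service.py /app/service.py
      COPY model /app/model

      # Start the production server (see Usage above).
      CMD ["flageval-serving", "--service", "service:DemoService", "run", "/app/model"]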
Download files
Source Distribution
flageval_serving-0.2.1.tar.gz (11.9 kB; details below)
File details
Details for the file flageval_serving-0.2.1.tar.gz.
File metadata
- Download URL: flageval_serving-0.2.1.tar.gz
- Upload date:
- Size: 11.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/7.0.2 pkginfo/1.10.0 requests/2.26.0 requests-toolbelt/1.0.0 tqdm/4.64.1 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | e3e2d3fedf2bb1cd268ee6e3e7c043d3b207489f9f85630d26d5ce0b5152c2be
MD5 | e5bbea7cab0bb3e87f047abe18c3c75f
BLAKE2b-256 | da842b1762bd787b68549a4b4e0d0cccb9be9cd7016df9de009a3b14a4e733d7