
Project description

FlagEval Serving

Serving Framework of AI Models for Evaluating on FlagEval Platform.

Installation

pip install --upgrade flageval-serving

Usage

  1. Model: of course we have a model that is ready to be evaluated; let's assume it lives at the path /path/to/model;

  2. Then we can write our service code. Let's put it in service.py (or ./tests/service.py) and take an NLP model as the example:

    from flageval.serving.service import NLPModelService, NLPEvalRequest, NLPEvalResponse, NLPCompletion
    
    
    class DemoService(NLPModelService):
        def global_init(self, model_path: str):
            print("Initial model with path", model_path)
    
        def infer(self, req: NLPEvalRequest) -> NLPEvalResponse:
            return NLPEvalResponse(
                completions=[
                    NLPCompletion(
                        text='Hello, world!',
                        tokens='Hello, world!',
                    ),
                ]
            )
    
  3. Finally, we use the flageval-serving command to serve:

    flageval-serving --service service:DemoService dev /path/to/model  # start a development server
    flageval-serving --service service:DemoService run /path/to/model  # start a production server
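
The lifecycle above — `global_init` called once at startup with the model path, then `infer` called per evaluation request — can be sketched with plain stand-in dataclasses. These mirror the field names from the example above but are only an illustration of the pattern, not the real `flageval.serving.service` API:

```python
from dataclasses import dataclass, field
from typing import List


# Stand-ins for the flageval.serving.service classes; the field names are
# assumptions based on the example above, not the actual library API.
@dataclass
class NLPEvalRequest:
    prompt: str


@dataclass
class NLPCompletion:
    text: str
    tokens: str


@dataclass
class NLPEvalResponse:
    completions: List[NLPCompletion] = field(default_factory=list)


class EchoService:
    def global_init(self, model_path: str) -> None:
        # Called once at startup; a real service would load weights here.
        self.model_path = model_path

    def infer(self, req: NLPEvalRequest) -> NLPEvalResponse:
        # Called for each evaluation request; here we simply echo the prompt.
        return NLPEvalResponse(
            completions=[NLPCompletion(text=req.prompt, tokens=req.prompt)]
        )


service = EchoService()
service.global_init("/path/to/model")
resp = service.infer(NLPEvalRequest(prompt="Hello, world!"))
print(resp.completions[0].text)  # Hello, world!
```

With the real framework, the serving command (dev or run) owns this loop: it instantiates your service class, calls `global_init` with the model path you pass on the command line, and dispatches incoming evaluation requests to `infer`.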
    

Dockerfile

A Dockerfile is provided for building the FlagEval evaluation platform image.

Download files


Source Distribution

flageval_serving-0.2.3.tar.gz (12.1 kB, Source)

File details

Details for the file flageval_serving-0.2.3.tar.gz.

File metadata

  • Download URL: flageval_serving-0.2.3.tar.gz
  • Size: 12.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.7

File hashes

Hashes for flageval_serving-0.2.3.tar.gz
Algorithm Hash digest
SHA256 8015dab6de6ae38f3a3e9ce886174eea2aa00722791599cf7851c223172f2404
MD5 df8581ec61e83c011aafe3aadb28cca1
BLAKE2b-256 1b035f7355eb37fe769efbcc4f7f3d873cf0fb490fc57e16c6ea3168b0591bbf

