
Run inference locally.

Project description

LLM Inference, an out-of-the-box SDK for local LLM inference

Installation

  • pip

    pip install git+https://gitlab.changdu.ltd/060270/llm_inference.git@main
    
  • uv

    uv add git+https://gitlab.changdu.ltd/060270/llm_inference.git@main
    
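The same Git source can also be pinned in a requirements.txt. The line below is a sketch using a PEP 508 direct reference; it assumes the distribution name is vllm_inference, which is what the built files listed further down use:

    vllm_inference @ git+https://gitlab.changdu.ltd/060270/llm_inference.git@main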

Key features

  1. Seed-X machine translation

    from vllm_inference.translation.seed_x import translate

    # Translate a Chinese sentence into French ('fr').
    # Near-zero temperature and a tight top_p keep decoding close to greedy.
    translation = translate('你好, 我叫吴子豪!', 'fr', max_tokens=2048, temperature=0.0, top_p=0.01)
    
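The translate call above handles one string at a time; a thin batching wrapper is easy to sketch. Note that everything here beyond the call signature shown above is hypothetical: translate_batch is our own helper, not part of the SDK, and the translation function is injected so the sketch stays self-contained.

```python
# Hypothetical convenience wrapper around a translate()-style callable.
# `translate_fn` is injected rather than imported so this sketch runs
# without the SDK installed; pass the real `translate` in practice.
def translate_batch(texts, target_lang, translate_fn, **gen_kwargs):
    """Translate a list of strings, returning results in input order."""
    return [translate_fn(text, target_lang, **gen_kwargs) for text in texts]
```

With the real SDK you would pass the imported translate function as translate_fn, along with the same keyword arguments (max_tokens, temperature, top_p).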

Download files

Download the file for your platform.

Source Distribution

vllm_inference-0.0.18.tar.gz (3.0 kB, Source)

Built Distribution


vllm_inference-0.0.18-py3-none-any.whl (4.0 kB, Python 3)

File details

Details for the file vllm_inference-0.0.18.tar.gz.

File metadata

  • Download URL: vllm_inference-0.0.18.tar.gz
  • Upload date:
  • Size: 3.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.6.14

File hashes

Hashes for vllm_inference-0.0.18.tar.gz:

  • SHA256: 044f4c399df1e980220f83ecfb35f7a8942924825d14c08db80b8cf021e9229d
  • MD5: 877b7e6d299c8bc03eae11ecd0b9eb82
  • BLAKE2b-256: 0cc4797a0e2326c4b389a643b63499e1d3cca5ec8d64a87e15a4f500efa4b3d3

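To check a download against the hashes listed above, a minimal sketch using Python's standard hashlib module (the function name is ours, and the file path is whatever you downloaded):

```python
import hashlib

def sha256_of(path):
    """Return the hex SHA-256 digest of a file, read in 8 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Compare the result with the SHA256 value above; a mismatch means the download is corrupt or was tampered with. pip can also enforce this automatically with hash-pinned requirements and --require-hashes.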

File details

Details for the file vllm_inference-0.0.18-py3-none-any.whl.


File hashes

Hashes for vllm_inference-0.0.18-py3-none-any.whl:

  • SHA256: 481348bfa8351fd72da5f10fe26abadba65d3bcf17eeb6883570c96e25a12812
  • MD5: bf36fa7ae117e28b55d3b44741e87cce
  • BLAKE2b-256: 302dc5eb4cc04b7262950c0580c8f0d178a60d73d76b1626c3d649e2d47d45e1

