
ANC CLI Tool

Overview

The ANC CLI tool is a command line interface for managing resources within the company. Initially, it supports managing datasets and their versions, letting users fetch, list, and add datasets on a remote server.

Installation

For Users

# Install the released ANC CLI tool
sudo pip install anc

For CLI Development

# Install the CLI in editable mode for development
cd dev/cli
sudo pip install -r requirements.txt
sudo pip install -e .

For Release

For build and release instructions, see Release Guide.

Dataset

  • Fetch Datasets: Retrieve specific versions of datasets from a remote server.
  • List Versions: View all available versions of a dataset.
  • Add Datasets: Upload new datasets along with their versions and descriptions to the remote server.

Usage

list

anc ds list 
# Or you can specify a dataset name.
anc ds list -n <dataset name>

get

# Based on the list output above, you can download a specific version of a dataset.
# Ensure the download destination is a permanent storage location (e.g., /mnt/weka/xxx). Downloading data to local storage is currently not permitted.
anc ds get cifar-10-batches-py -v 1.0
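The permanent-storage requirement can be checked before fetching. A minimal sketch (the path and the /mnt/weka prefix are illustrative, not part of the CLI):

```shell
# Illustrative pre-flight check: only fetch into permanent storage.
# /mnt/weka is assumed to be the shared mount; substitute your cluster's path.
DEST="/mnt/weka/xug/datasets"

case "$DEST" in
  /mnt/weka/*) STORAGE="permanent" ;;
  *)           STORAGE="local" ;;
esac

echo "$DEST is on $STORAGE storage"
```

A wrapper script could run a check like this and refuse to call `anc ds get` when the destination resolves to local storage.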

add

# Upload a specific version of a dataset. The dataset name is derived from the
# file or folder name in the specified path.
# Ensure the dataset is stored in a permanent location recognized by the server (e.g., /mnt/weka/xxx).
anc ds add /mnt/weka/xug/dvc_temp/cifar-10-batches-py -v 1.0

load-test

load test with real data

# The load test uses vLLM as the serving backend.
pip install vllm

# Run the load test. This starts the server, sends the benchmark requests,
# saves the results to a JSON file, and plots the results.
anc loadtest run \
--model /mnt/share/ocean/candidate1 \
--max-model-len 8000 \
--backend vllm \
--port 8004 \
--tensor-parallel-size "4" \
--enable-prefix-caching "True" \
--dataset-name anc \
--dataset-path /mnt/share/infra/hongbo/load_test_data/1000_ocean_prompt_pressure_test_v2_02_25.jsonl \
--num-prompts 300 \
--max-concurrency "1,2,4,6,8,10,12,24" \
--result-dir "./test" \
--gpu-memory-utilization 0.8 \
--seed 10
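--max-concurrency takes a comma-separated list, and the tool presumably benchmarks each level in turn. A sketch of how such a list expands into individual levels (plain shell, independent of the CLI; the variable names are illustrative):

```shell
# Split a comma-separated concurrency list into individual levels,
# tolerating stray spaces such as those in "1,2,4, 6, 8".
LEVELS_RAW="1,2,4, 6, 8, 10, 12, 24"
LEVELS=()
IFS=',' read -ra PARTS <<< "$LEVELS_RAW"
for p in "${PARTS[@]}"; do
  LEVELS+=("${p// /}")   # strip any embedded spaces
done

echo "benchmarking at concurrency levels: ${LEVELS[*]}"
```

Each level corresponds to one benchmark pass, so longer lists multiply total run time accordingly.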


# You can also plot the results directly from the JSON file.
anc loadtest plot --dataset-name anc ./test/all_results.json
# If you have multiple JSON files in the same directory, you can plot all of them:
anc loadtest plot --dataset-name anc ./test
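When pointed at a directory, the plot command appears to pick up every JSON result file in it. A quick way to preview which files a directory-based plot would consume (the demo directory and file names are made up here; in practice they are written by the run command):

```shell
# Create a throwaway results directory to demonstrate (normally these
# files are produced by `anc loadtest run`).
mkdir -p ./demo_results
touch ./demo_results/all_results.json ./demo_results/run2.json

# List the JSON files a directory-based plot would consume.
JSON_FILES=$(find ./demo_results -maxdepth 1 -name '*.json' | sort)
echo "$JSON_FILES"
COUNT=$(echo "$JSON_FILES" | wc -l)

rm -r ./demo_results
```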

load test with random data

anc loadtest run \
--model /mnt/project/llm/ckpt/stable_ckpts/Llama-3.2-1B/ \
--max-model-len 200 \
--backend vllm \
--port 8004 \
--dataset-name random \
--num-prompts 1 \
--max-concurrency "1" \
--random-input-len "10" \
--result-dir "./test"  \
--skip-server

load test with remote endpoint

Grid search over server parameters won't work with this method, since the remote server cannot be restarted (i.e., TP will be whatever the endpoint already uses). Prefix caching is also controlled by the server, so you may see a very high hit rate if your request sample size is small.

anc loadtest run \
--model /mnt/share/ocean/candidate1 \
--model-id ocean-llm \
--backend vllm \
--dataset-name anc \
--dataset-path /mnt/share/infra/hongbo/load_test_data/1000_ocean_prompt_pressure_test_v2_02_25.jsonl \
--num-prompts 10 \
--max-concurrency "1,2,4" \
--result-dir "./test" \
--seed 10 \
--skip-server \
--base-url "http://ocean-test-2.serving-prod.va-mlp.anuttacon.com"

