MLCore Python Client
A powerful Python client for MLCore, providing a database-like interface for managing datasets and machine learning models. Built for data scientists and engineers who want to automate their machine learning workflows.
Links
- GitHub Repository: Amanbig/MLCore
- Docker Hub: procoder588/mlcore
Installation
```bash
pip install mlcore-client
```
Connection
Connect to your MLCore instance using a connection string (URI) or explicit parameters.
```python
from mlcore import MLCore

# Option 1: Connection string (recommended)
# Format: mlcore://user:password@host:port
client = MLCore("mlcore://admin:password@localhost:8000")

# Option 2: Explicit parameters
client = MLCore(
    host="localhost",
    port=8000,
    email="admin@example.com",
    password="password",
)
```
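As an aside, the `mlcore://` connection string follows standard URI syntax, so its components can be inspected with the standard library. This sketch only illustrates the format; it is not the client's internal parser:

```python
from urllib.parse import urlparse

# A mlcore:// connection string decomposes like any URI.
uri = "mlcore://admin:password@localhost:8000"
parts = urlparse(uri)

print(parts.scheme)    # mlcore
print(parts.username)  # admin
print(parts.password)  # password
print(parts.hostname)  # localhost
print(parts.port)      # 8000
```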
Asynchronous Support
For asyncio applications, use MLCoreAsync:
```python
from mlcore import MLCoreAsync

client = MLCoreAsync("mlcore://admin:password@localhost:8000")
await client.connect()
# ... use await with all methods ...
await client.close()
```
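In a script (rather than a REPL or notebook), the `connect`/`close` lifecycle typically lives inside a coroutine driven by `asyncio.run`. The sketch below uses a hypothetical `MLCoreAsyncStub` as a stand-in for `mlcore.MLCoreAsync` so it runs without a server; only the pattern is the point:

```python
import asyncio

# Stand-in for mlcore.MLCoreAsync; records connection state only.
class MLCoreAsyncStub:
    def __init__(self, uri: str):
        self.uri = uri
        self.connected = False

    async def connect(self):
        self.connected = True

    async def close(self):
        self.connected = False

async def main():
    client = MLCoreAsyncStub("mlcore://admin:password@localhost:8000")
    await client.connect()
    # ... await client.datasets.list(), await client.models.train(...), etc. ...
    await client.close()
    return client

client = asyncio.run(main())
print(client.connected)  # False (closed cleanly)
```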
API Documentation
📊 Dataset Manager (client.datasets)
Manage data lifecycle from raw files to cleaned versions.
| Method | Description | Parameters |
|---|---|---|
| `list()` | List all accessible datasets. | - |
| `get(id)` | Get metadata for a dataset. | `id`: UUID or string |
| `upload_file(path)` | Upload a raw CSV/Excel file. | `path`: local file path |
| `create(...)` | Register a file as a dataset. | `name`, `description`, `file_id`, `rows`, `columns`, `metadata` |
| `get_data(id, ...)` | Fetch paginated rows. | `id`, `page=1`, `limit=50`, `as_df=False` |
| `clean(id, ...)` | Apply cleaning logic. | `id`, `strategy` (`'drop_nulls'`, `'fill_mean'`, etc.), `columns` |
| `transform(id, ...)` | Apply transformations. | `id`, `strategy` (`'standard_scaler'`, etc.), `columns` |
| `get_versions(id)` | Get history/lineage. | `id` |
| `delete(id)` | Permanently remove a dataset. | `id` |
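The strategy names above map to familiar column-level operations. This local sketch shows the math that `'fill_mean'` and `'standard_scaler'` presumably apply server-side to stored dataset columns; it does not call the MLCore API:

```python
from statistics import mean, pstdev

# clean(strategy='fill_mean'): replace nulls with the column mean.
column = [25.0, None, 40.0]
present = [x for x in column if x is not None]
filled = [x if x is not None else mean(present) for x in column]
print(filled)  # [25.0, 32.5, 40.0]

# transform(strategy='standard_scaler'): center to mean 0, unit variance.
values = [10.0, 20.0, 30.0]
mu, sigma = mean(values), pstdev(values)
scaled = [(x - mu) / sigma for x in values]
print(round(scaled[0], 3), scaled[1])  # -1.225 0.0
```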
🤖 Model Manager (client.models)
Train, evaluate, and deploy machine learning models.
| Method | Description | Parameters |
|---|---|---|
| `list()` | List all trained models. | - |
| `get(id)` | Get model specs & metrics. | `id`: UUID or string |
| `train(...)` | Start a training job. | `dataset_id`, `algorithm`, `target_column`, `features`, `hyperparameters`, `name` |
| `predict(id, inputs)` | Run real-time inference. | `id`, `inputs`: Dict[feature_name, value] |
| `download(id, path)` | Download the `.joblib` file. | `id`, `path`: local destination path |
| `retrain(id, ...)` | Run a new training session. | `id`, `dataset_id`, `algorithm`, etc. |
| `get_hyperparameters(algo)` | Get valid hyperparameters. | `algo`: algorithm name |
| `get_versions(id)` | Get model history. | `id` |
| `update_meta(id, ...)` | Rename/update description. | `id`, `name`, `description` |
| `delete(id)` | Delete model artifacts. | `id` |
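Note that `predict()` expects `inputs` as a mapping from feature name to value, not a bare list. If your features and values live in parallel sequences, the dict is one `zip` away (local sketch, no API call):

```python
# Build the Dict[feature_name, value] payload predict() expects.
features = ["feature_1", "feature_2"]
row = [0.5, 1.2]
inputs = dict(zip(features, row))
print(inputs)  # {'feature_1': 0.5, 'feature_2': 1.2}
```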
📈 General Methods
| Method | Description |
|---|---|
| `get_stats()` | Get platform-wide statistics (counts of models, datasets, files). |
| `health_check()` | Check server connectivity and version. |
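A `health_check()`-style probe is often wrapped in a retry loop at startup, waiting for the server to come up. This generic sketch is not part of the MLCore API: `wait_until_healthy` and the stubbed probe are hypothetical, and in practice the probe would call `client.health_check()`:

```python
import time

def wait_until_healthy(check, retries=5, base_delay=0.01):
    """Call the zero-argument probe until it returns True, with
    exponential backoff between attempts; False if it never does."""
    for attempt in range(retries):
        if check():
            return True
        time.sleep(base_delay * 2 ** attempt)
    return False

# Stub probe that succeeds on the third call.
calls = {"n": 0}
def fake_check():
    calls["n"] += 1
    return calls["n"] >= 3

ok = wait_until_healthy(fake_check)
print(ok, calls["n"])  # True 3
```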
Usage Examples
End-to-End Pipeline
```python
# 1. Prepare data
file_info = client.datasets.upload_file("raw_data.csv")
ds = client.datasets.create(name="Training Set", file_id=file_info["id"], ...)
cleaned_ds = client.datasets.clean(ds["id"], strategy="drop_nulls")

# 2. Train a model
model = client.models.train(
    dataset_id=cleaned_ds["id"],
    algorithm="random_forest",
    target_column="label",
    hyperparameters={"n_estimators": 200},
)

# 3. Inspect metrics
print(f"Model trained with {model['accuracy']}% accuracy")

# 4. Predict
result = client.models.predict(model["id"], inputs={"feature_1": 0.5, "feature_2": 1.2})
print(f"Prediction: {result['predictions']}")
```
Features
- Database-like Connection: URI-based connection strings (`mlcore://`).
- Session Persistence: Automatic token management and re-auth logic.
- Pandas Ready: One-click conversion from server data to DataFrames.
- Async First: First-class support for `httpx`-based asynchronous I/O.
- Developer Friendly: Strictly typed and linted with Ruff.
License
MIT
File details
Details for the file mlcore_client-0.1.2.tar.gz.

File metadata
- Download URL: mlcore_client-0.1.2.tar.gz
- Size: 9.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1279f3803d5f086d28567f456df40f360dc314d9cc282cceb064f90d43706e70 |
| MD5 | 01e712ef61f1b84d2d2ae414cfa0e1a6 |
| BLAKE2b-256 | 4df408ccd9041ad08b1185e34f20987ebeddb21b64495d174bc5a8f5af7c083a |
Provenance
The following attestation bundles were made for mlcore_client-0.1.2.tar.gz:

Publisher: release.yml on Amanbig/mlcore-client
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mlcore_client-0.1.2.tar.gz
- Subject digest: 1279f3803d5f086d28567f456df40f360dc314d9cc282cceb064f90d43706e70
- Sigstore transparency entry: 1340370208
- Permalink: Amanbig/mlcore-client@1aa3d47d9df63a4272698484e4ca1015884115c4
- Branch / Tag: refs/heads/main
- Owner: https://github.com/Amanbig
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@1aa3d47d9df63a4272698484e4ca1015884115c4
- Trigger Event: push
File details
Details for the file mlcore_client-0.1.2-py3-none-any.whl.

File metadata
- Download URL: mlcore_client-0.1.2-py3-none-any.whl
- Size: 8.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3a2dd16cd9c5459ab6708823c2fc547d3311079f798af2853b911d897ff28667 |
| MD5 | 637a6b833f26df0586be4d27e5ede1e9 |
| BLAKE2b-256 | cca929e0207ceb5357f8f2809473651f68720911b4965e35eea8ec5501ed2fac |
Provenance
The following attestation bundles were made for mlcore_client-0.1.2-py3-none-any.whl:

Publisher: release.yml on Amanbig/mlcore-client
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mlcore_client-0.1.2-py3-none-any.whl
- Subject digest: 3a2dd16cd9c5459ab6708823c2fc547d3311079f798af2853b911d897ff28667
- Sigstore transparency entry: 1340370212
- Permalink: Amanbig/mlcore-client@1aa3d47d9df63a4272698484e4ca1015884115c4
- Branch / Tag: refs/heads/main
- Owner: https://github.com/Amanbig
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@1aa3d47d9df63a4272698484e4ca1015884115c4
- Trigger Event: push