
Neptune Client

Project description

neptune.ai

Quickstart   •   Website   •   Docs   •   Examples   •   Resource center   •   Blog  

What is neptune.ai?

Neptune is a lightweight experiment tracker for ML teams that struggle with debugging and reproducing experiments, sharing results, and messy model handover. It offers a single place to track, compare, store, and collaborate on experiments and models.

With Neptune, data scientists can develop production-ready models faster, and ML engineers can access model artifacts instantly to deploy them to production.

Watch a 3min explainer video →  

Watch a 20min product demo →  

Play with a live example project in the Neptune app →  

Getting started

Step 1: Create a free account

Step 2: Install Neptune client library

pip install neptune

Step 3: Add experiment tracking snippet to your code

import neptune

# Reads the NEPTUNE_API_TOKEN environment variable for authentication;
# "Me/MyProject" is your workspace/project name
run = neptune.init_run(project="Me/MyProject")
run["parameters"] = {"lr": 0.1, "dropout": 0.4}
run["test_accuracy"] = 0.84

Open in Colab  
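Beyond single values, the same run can log a series of metric values from inside a training loop. A minimal sketch, assuming `NEPTUNE_API_TOKEN` is set and using a placeholder project name and a stand-in loss:

```python
import neptune

# Placeholder project; authentication comes from the NEPTUNE_API_TOKEN env var
run = neptune.init_run(project="Me/MyProject")
run["parameters"] = {"lr": 0.1, "dropout": 0.4}

for epoch in range(10):
    loss = 1.0 / (epoch + 1)        # stand-in for a real training step
    run["train/loss"].append(loss)  # appends one step to the "train/loss" series

run["test_accuracy"] = 0.84
run.stop()  # flush pending data and close the connection
```

Calling `run.stop()` explicitly is optional in scripts (it happens at exit) but recommended in notebooks, where the process keeps running.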

 

Core features

Log and display

Add a snippet to any step of your ML pipeline once. Decide what and how you want to log. Run a million times.

  • Any framework: works with any Python code, with integrations for PyTorch, PyTorch Lightning, TensorFlow/Keras, scikit-learn, LightGBM, XGBoost, Optuna, Kedro, and more.

  • Any metadata type: metrics, parameters, dataset and model versions, images, interactive plots, videos, hardware (GPU, CPU, memory), code state.

  • From anywhere in your ML pipeline: multi-node pipelines and distributed computing; log during or after execution, log offline, and sync when you are back online.  
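Offline logging is a one-argument change. A sketch with a placeholder project name: the run is written to local disk and uploaded later with the `neptune sync` CLI command:

```python
import neptune

# mode="offline" stores all metadata on local disk instead of sending it
# to the Neptune server, so no network connection is needed while training
run = neptune.init_run(project="Me/MyProject", mode="offline")
run["parameters/lr"] = 0.01
run.stop()

# Later, back online, upload the stored run from the shell:
#   neptune sync
```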


Organize experiments

Organize logs in a fully customizable nested structure. Display model metadata in user-defined dashboard templates.

  • Nested metadata structure: the flexible API lets you customize the metadata logging structure however you want. Organize nested parameter configs or the results of k-fold validation splits the way they should be organized.

  • Custom dashboards: combine different metadata types in one view. Define a dashboard for one run and reuse it anywhere. Look at GPU, memory consumption, and load times to debug training speed. See learning curves, image predictions, and confusion matrices to debug model quality.

  • Table views: create different views of the runs table and save them for later. You can have separate table views for debugging, comparing parameter sets, or tracking your best experiments.  
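The nesting comes from the field paths themselves: slashes create namespaces. A sketch with a placeholder project and illustrative field names:

```python
import neptune

run = neptune.init_run(project="Me/MyProject")  # placeholder project

# Slashes in the field path create nested namespaces in the run structure
run["params/optimizer/name"] = "Adam"
run["params/optimizer/lr"] = 3e-4

# e.g. results per k-fold split grouped under one "cv" namespace
for fold, acc in enumerate([0.81, 0.84, 0.79]):
    run[f"cv/fold_{fold}/accuracy"] = acc

run.stop()
```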


Compare results

Visualize training live in the neptune.ai web app. See how different parameters and configs affect the results. Optimize models quicker.

  • Compare: learning curves, parameters, images, datasets.

  • Search, sort, and filter: experiments by any field you logged. Use our query language to filter runs based on parameter values, metrics, execution times, or anything else.

  • Visualize and display: runs table, interactive display, folder structure, dashboards.

  • Monitor live: hardware consumption metrics, GPU, CPU, memory.

  • Group by: dataset versions, parameters.  
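Comparison doesn't have to happen in the app: the runs table can also be pulled into pandas. A sketch using placeholder project and tag names, and assuming `test_accuracy` was logged at the top level of each run:

```python
import neptune

# Read-only handle to the project; the name is a placeholder
project = neptune.init_project(project="Me/MyProject", mode="read-only")

# Fetch runs tagged "resnet" as a pandas DataFrame; logged fields
# appear as columns named after their paths
runs_df = project.fetch_runs_table(tag="resnet").to_pandas()
print(runs_df.sort_values("test_accuracy", ascending=False).head())
```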


Version models

Version, review, and access production-ready models and metadata associated with them in a single place.

  • Version models: register models, create model versions, version external model artifacts.

  • Review and change stages: review validation and test metrics and other model metadata. You can move models between the None/Staging/Production/Archived stages.

  • Access and share models: every model and model version is accessible via the neptune.ai web app or through the API.  
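The registry workflow can be sketched as follows; the key, project, model ID, metric names, and `model.pkl` artifact are all placeholders:

```python
import neptune

# Register a model once (key and project are placeholders)
model = neptune.init_model(key="MOD", project="Me/MyProject")
model["description"] = "tabular classifier"
model.stop()

# Create a version, attach the artifact, and promote it through stages
model_version = neptune.init_model_version(model="MYPROJ-MOD")  # placeholder model ID
model_version["model/binary"].upload("model.pkl")
model_version["validation/acc"] = 0.92
model_version.change_stage("staging")  # none -> staging -> production -> archived
model_version.stop()
```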


Share results

Have a single place where your team can see the results and access all models and experiments.

  • Send a link: share every chart, dashboard, table view, or anything else you see in the neptune.ai app by copying and sending persistent URLs.

  • Query API: access all model metadata via the neptune.ai API. Whatever you logged, you can query back programmatically.

  • Manage users and projects: create different projects, add users to them, and grant different permissions levels.

  • Add your entire org: get unlimited users on every paid plan. So you can invite your entire organization, including product managers and subject matter experts at no extra cost.  
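Querying metadata back mirrors how it was logged. A sketch that reopens an existing run by ID; the run ID and field paths are placeholders:

```python
import neptune

# Reopen an existing run by its ID without modifying it
run = neptune.init_run(with_id="MYPROJ-1", mode="read-only")

accuracy = run["test_accuracy"].fetch()     # a single logged value
loss_df = run["train/loss"].fetch_values()  # a logged series, as a pandas DataFrame
run["model/weights"].download()             # a logged file artifact, if one exists
```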


Integrate with any MLOps stack

neptune.ai integrates with 25+ frameworks: PyTorch, PyTorch Lightning, TensorFlow/Keras, LightGBM, scikit-learn, XGBoost, Optuna, Kedro, 🤗 Transformers, fastai, Prophet, and more.



PyTorch Lightning

Example:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import NeptuneLogger

from neptune import ANONYMOUS_API_TOKEN

# Create the NeptuneLogger instance
neptune_logger = NeptuneLogger(
    api_key=ANONYMOUS_API_TOKEN,
    project="common/pytorch-lightning-integration",
    tags=["training", "resnet"],  # optional
)

# Pass the logger to the Trainer
trainer = Trainer(max_epochs=10, logger=neptune_logger)

# Run the Trainer (my_model and my_dataloader are your LightningModule and DataLoader)
trainer.fit(my_model, my_dataloader)


neptune.ai is trusted by great companies

 

Read how various customers use Neptune to improve their workflow.  

 

Support

If you get stuck or simply want to talk to us about something, reach out through the support channels listed on the neptune.ai website.

 

People behind

Created with ❤️ by the neptune.ai team:

Piotr, Paulina, Jakub, Tomek, Magda, Aurimas, Chaz, Alexandra, Marcin, Tymoteusz, Parth, Aleksandra, Sabine, Tytus, Patrycja, Dawid, Dominika, Karolina, Aleksiej, Artur, Prince, Siddhant, Kshiteej, Piotr, Hubert, Adam, Rafał, Patryk, Bartosz, Jakub, Artsiom, Jakub, Marcin, Jakub, Paweł, Franciszek, Bartosz, Siamion, Aleksander, Małgorzata, Michał, Karolina, Martyna, and you?

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

neptune-1.8.5.tar.gz (264.4 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

neptune-1.8.5-py3-none-any.whl (479.7 kB)

Uploaded Python 3

File details

Details for the file neptune-1.8.5.tar.gz.

File metadata

  • Download URL: neptune-1.8.5.tar.gz
  • Upload date:
  • Size: 264.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for neptune-1.8.5.tar.gz:

  • SHA256: 27d59cbae399b18392d2d77d7ef9812f8a2842f78c0471066fc63f6d17464599
  • MD5: be192872b408cba316afe7f12a1895a6
  • BLAKE2b-256: 9c8807f30e466cfbb0f5f03393913bada891d4dcead85247800d47fd2a716002

See more details on using hashes here.
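To verify a downloaded file against one of the published digests, a minimal sketch using Python's standard `hashlib`:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published digest, e.g.:
#   sha256_of_file("neptune-1.8.5.tar.gz") == "27d59cba..."
```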

File details

Details for the file neptune-1.8.5-py3-none-any.whl.

File metadata

  • Download URL: neptune-1.8.5-py3-none-any.whl
  • Upload date:
  • Size: 479.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for neptune-1.8.5-py3-none-any.whl:

  • SHA256: 53465afff4dce5a465f171db428abafe354df9cd1451b043e3ee414d87713736
  • MD5: 80646093f8afe8751ae23abf9aa561bb
  • BLAKE2b-256: 826c543a4d807cc0b2e03ec50f74d2ddf150903281e3fd5e29d9ff7ac9d0e658

See more details on using hashes here.
