Additional metrics integrated with the Keras NN library, taken directly from `TensorFlow <https://www.tensorflow.org/api_docs/python/tf/metrics/>`_
How do I install this package?
As usual, just install it using pip:
pip install extra_keras_metrics
Test coverage
Since different coverage-measuring tools sometimes report slightly different results, here are three of them:
How do I use this package?
Just by importing it you will be able to access all the non-parametric metrics, such as “auprc” and “auroc”:
import extra_keras_metrics
model = my_keras_model()
model.compile(
    optimizer="sgd",
    loss="binary_crossentropy",
    metrics=["auroc", "auprc"]
)
For the parametric metrics, such as “average_precision_at_k”, you will need to import them explicitly:
from extra_keras_metrics import average_precision_at_k
model = my_keras_model()
model.compile(
    optimizer="sgd",
    loss="binary_crossentropy",
    metrics=[average_precision_at_k(1), average_precision_at_k(2)]
)
This way, the model’s training history will contain both metrics, indexed as “average_precision_at_k_1” and “average_precision_at_k_2” respectively.
Which metrics do I get?
You will get all the following metrics taken directly from Tensorflow. At the time of writing, the ones available are the following:
The non-parametric ones are (tested against their counterparts from sklearn):
AUPRC (tested against sklearn’s average_precision_score).
AUROC (tested against sklearn’s roc_auc_score).
false_negatives (tested against false negatives from sklearn’s confusion_matrix).
false_positives (tested against false positives from sklearn’s confusion_matrix).
mean_absolute_error (tested against sklearn’s mean_absolute_error).
mean_squared_error (tested against sklearn’s mean_squared_error).
precision (tested against sklearn’s precision_score).
recall (tested against sklearn’s recall_score).
root_mean_squared_error (tested against the square root of sklearn’s mean_squared_error).
true_negatives (tested against true negatives from sklearn’s confusion_matrix).
true_positives (tested against true positives from sklearn’s confusion_matrix).
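As a quick illustration of what the AUROC entry above measures, here is a minimal pure-Python sketch using the rank-sum (Mann–Whitney U) formulation. This is only a didactic sketch, not the package’s implementation, which delegates to TensorFlow:

```python
def auroc(y_true, y_score):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) formulation."""
    n = len(y_score)
    # Average 1-based ranks of the scores, with ties sharing their mean rank.
    order = sorted(range(n), key=lambda i: y_score[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and y_score[order[j + 1]] == y_score[order[i]]:
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    pos_ranks = [r for r, label in zip(ranks, y_true) if label == 1]
    n_pos = len(pos_ranks)
    n_neg = n - n_pos
    # U statistic of the positives, normalised to [0, 1].
    return (sum(pos_ranks) - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)
```

For example, `auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])` yields 0.75, matching sklearn’s roc_auc_score on the same inputs.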
The parametric ones are (only execution is tested, as no sklearn baseline was available):
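To make the parametric family concrete, here is a small standalone sketch of average precision over the top-k ranked items. The normalisation by min(k, number of relevant items) is my reading of the tf.metrics definition; the helper is deliberately named ap_at_k so it is not confused with the package’s own average_precision_at_k, which takes only k and returns a Keras-compatible metric:

```python
def ap_at_k(y_true, y_score, k):
    """Average precision over the top-k ranked items.

    Sums precision@i at each rank i <= k that holds a relevant item,
    then divides by min(k, number of relevant items).
    """
    top_k = sorted(range(len(y_score)), key=lambda i: -y_score[i])[:k]
    n_relevant = sum(y_true)
    if n_relevant == 0:
        return 0.0
    hits, total = 0, 0.0
    for rank, idx in enumerate(top_k, start=1):
        if y_true[idx] == 1:
            hits += 1
            total += hits / rank
    return total / min(k, n_relevant)
```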
Extras
I have also created a couple of other packages you might enjoy: extra_keras_utils, which contains some commonly used code for Keras projects, and plot_keras_history, which automatically plots a Keras training history.