Medical Record Chart Review Calculator
Project description
Chart Review
Measure agreement between chart reviewers.
Whether your chart annotations come from humans, machine learning, or coded data like ICD-10, chart-review can compare them to reveal agreement statistics such as:
Accuracy
- F1-score (agreement)
- Cohen's Kappa (agreement)
- Sensitivity and Specificity
- Positive (PPV) or Negative Predictive Value (NPV)
- False Negative Rate (FNR)
Confusion Matrix
- TP = True Positive
- TN = True Negative
- FP = False Positive (type I error)
- FN = False Negative (type II error)
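Each of the accuracy statistics above is a simple function of these four confusion-matrix counts. As an illustrative sketch only (not chart-review's actual implementation), they can be computed like this:

```python
def metrics(tp, fn, tn, fp):
    """Compute agreement statistics from confusion-matrix counts."""
    total = tp + fn + tn + fp
    sens = tp / (tp + fn) if tp + fn else 0.0  # sensitivity (recall)
    spec = tn / (tn + fp) if tn + fp else 0.0  # specificity
    ppv = tp / (tp + fp) if tp + fp else 0.0   # positive predictive value
    npv = tn / (tn + fn) if tn + fn else 0.0   # negative predictive value
    f1 = 2 * ppv * sens / (ppv + sens) if ppv + sens else 0.0
    # Cohen's kappa: observed agreement corrected for chance agreement
    po = (tp + tn) / total
    pe = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / total**2
    kappa = (po - pe) / (1 - pe) if pe != 1 else 0.0
    return {"F1": f1, "Sens": sens, "Spec": spec,
            "PPV": ppv, "NPV": npv, "Kappa": kappa}


# The "Cough" row from the example below: TP=1, FN=1, TN=1, FP=0
print(metrics(1, 1, 1, 0))
```

Plugging in the Cough row's counts reproduces that row of the example output: F1 0.667, sensitivity 0.5, specificity 1.0, kappa 0.4.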
Documentation
For guides on installing & using Chart Review, read our documentation.
Example
```shell
$ ls
config.yaml  labelstudio-export.json
$ chart-review accuracy jill jane
Comparing 3 charts (1, 3–4)
Truth: jill
Annotator: jane

F1     Sens  Spec  PPV  NPV   Kappa  TP  FN  TN  FP  Label
0.667  0.75  0.6   0.6   0.75  0.341  3   1   3   2   *
0.667  0.5   1.0   1.0   0.5   0.4    1   1   1   0   Cough
1.0    1.0   1.0   1.0   1.0   1.0    2   0   1   0   Fatigue
0      0     0     0     0     0      0   0   1   2   Headache
```
Contributing
We love 💖 contributions!
If you have a good suggestion 💡 or found a bug 🐛, read our brief contributors guide for pointers to filing issues and what to expect.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
chart_review-1.2.0.tar.gz (32.9 kB)
Built Distribution
chart_review-1.2.0-py3-none-any.whl
Hashes for chart_review-1.2.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 82f74d7791b2d53e0e5631ed4810998e8ec7eb2540ff00a3e08d5f7f007ebed9
MD5 | 18e37cb7f69b4e6761597645e426c624
BLAKE2b-256 | 7207681aa179471455ca4475d9db792e57aa09f8f2c00f11cdd5c18b51bc937e
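To confirm a download matches the published digest above, you can hash the file locally. A minimal sketch using Python's standard library (the wheel filename and SHA256 value are the ones listed above; adjust the path to wherever the file was saved):

```python
import hashlib
from pathlib import Path

# Published SHA256 digest for chart_review-1.2.0-py3-none-any.whl
EXPECTED = "82f74d7791b2d53e0e5631ed4810998e8ec7eb2540ff00a3e08d5f7f007ebed9"


def sha256_of(path):
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


wheel = Path("chart_review-1.2.0-py3-none-any.whl")
if wheel.exists():
    print("OK" if sha256_of(wheel) == EXPECTED else "MISMATCH")
```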