hccinfhir (HCC in FHIR)
HCC (Hierarchical Condition Category) Algorithm Implementation for FHIR Resources
Overview
hccinfhir implements the CMS-HCC Risk Adjustment model using FHIR (Fast Healthcare Interoperability Resources) data. It processes data from CMS FHIR APIs such as Blue Button 2.0 and the Beneficiary Claims Data API (BCDA) to calculate Hierarchical Condition Category (HCC) Risk Adjustment Factors (RAF).
Why FHIR-Based HCC Processing?
Risk Adjustment calculations traditionally rely on processed claims data, leading to information loss and reconciliation challenges. hccinfhir processes FHIR resources directly because:
- FHIR represents the source of truth with complete clinical and administrative data
- Risk Adjustment requires multiple data elements beyond diagnosis codes
- Direct processing eliminates data transformation errors and simplifies reconciliation
Data Flexibility
While built for native FHIR processing, hccinfhir works with any data source that can be transformed into the MDE format:
```python
mde = [{
    "procedure_code": "99214",
    "diagnosis_codes": ["E11.9", "I10"],
    "claim_type": "71",
    "provider_specialty": "01",
    "service_date": "2024-01-15"
}, ...]
```
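As a minimal sketch of that flexibility, a flat (non-FHIR) claims record could be mapped into the MDE shape by hand. The input field names here ("hcpcs_code", "dx_codes", "specialty") are assumptions about some hypothetical upstream source, not part of hccinfhir's API:

```python
# Hypothetical sketch: mapping one flat claims record into the MDE format.
# Input field names are assumptions about an upstream source, not hccinfhir's API.
def claims_row_to_mde(row: dict) -> dict:
    return {
        "procedure_code": row["hcpcs_code"],
        "diagnosis_codes": [dx for dx in row["dx_codes"] if dx],  # drop blanks
        "claim_type": row["claim_type"],
        "provider_specialty": row["specialty"],
        "service_date": row["service_date"],
    }

mde = claims_row_to_mde({
    "hcpcs_code": "99214",
    "dx_codes": ["E11.9", "I10", ""],
    "claim_type": "71",
    "specialty": "01",
    "service_date": "2024-01-15",
})
```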
Components
1. Extractor Module
Processes FHIR ExplanationOfBenefit resources to extract Minimum Data Elements (MDE):
```python
from hccinfhir.extractor import extract_mde, extract_mde_list

mde = extract_mde(eob_data)                # process a single EOB
mde_list = extract_mde_list([eob1, eob2])  # process multiple EOBs
```
2. Logic Module (In Development)
Implements core HCC calculation logic:
- Maps diagnosis codes to HCC categories
- Applies hierarchical rules and interactions
- Calculates final RAF scores
- Integrates with standard CMS data files
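The hierarchical ("trumping") step can be sketched independently of the module's eventual implementation: when a more severe HCC is present, the less severe categories it dominates are removed before scoring. The hierarchy below is a toy example, not the official CMS-HCC hierarchy table:

```python
# Toy sketch of hierarchical trumping; not the actual CMS-HCC table or
# hccinfhir's implementation, which is still in development.
def apply_hierarchies(hccs: set, hierarchies: dict) -> set:
    dropped = set()
    for hcc in hccs:
        dropped |= hierarchies.get(hcc, set())  # categories trumped by this HCC
    return hccs - dropped

toy_hierarchy = {
    "HCC17": {"HCC18", "HCC19"},  # more severe category trumps milder ones
    "HCC18": {"HCC19"},
}
final = apply_hierarchies({"HCC18", "HCC19"}, toy_hierarchy)  # -> {"HCC18"}
```

Only the surviving categories ("HCC18" here) would contribute coefficients to the RAF score.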
3. Filter Module (In Development)
Implements claim filtering rules:
- Inpatient/outpatient criteria
- Professional service requirements
- Provider validation
- Date range filtering
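A hypothetical sketch of the kinds of checks such a filter could apply to MDE records. The eligible claim-type set and the specific rules are assumptions for illustration, not the library's actual filtering logic:

```python
from datetime import date

# Hypothetical filter sketch; claim-type set and rules are illustrative
# assumptions, not hccinfhir's actual (in-development) rules.
ELIGIBLE_CLAIM_TYPES = {"71", "72"}

def keep_mde(mde: dict, start: date, end: date) -> bool:
    svc = date.fromisoformat(mde["service_date"])
    return (
        mde["claim_type"] in ELIGIBLE_CLAIM_TYPES  # claim-type criteria
        and bool(mde["diagnosis_codes"])           # must carry a diagnosis
        and start <= svc <= end                    # date range filtering
    )

window = (date(2024, 1, 1), date(2024, 12, 31))
records = [
    {"claim_type": "71", "diagnosis_codes": ["E11.9"], "service_date": "2024-01-15"},
    {"claim_type": "40", "diagnosis_codes": ["I10"], "service_date": "2024-03-02"},
]
kept = [m for m in records if keep_mde(m, *window)]
```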
Installation
```bash
pip install hccinfhir
```
Usage
```python
from hccinfhir import HCCInFHIR

hcc_processor = HCCInFHIR()
mde_list = hcc_processor.extract_mde_list(eob_list)
filtered_mde = hcc_processor.apply_filters(mde_list)                       # planned (Filter module)
raf_details = hcc_processor.calculate_raf(filtered_mde, demographic_data)  # planned (Logic module)
```
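Until RAF calculation lands, the arithmetic it performs can be illustrated: a RAF score is a demographic factor plus the coefficient of each retained HCC. The coefficient values below are invented for the example and are not actual CMS-HCC model values:

```python
# Illustration of RAF composition: demographic factor + retained HCC weights.
# All coefficient values here are made up, NOT actual CMS-HCC model values.
coefficients = {
    "F_70_74": 0.395,  # hypothetical demographic factor
    "HCC18": 0.302,    # hypothetical condition weight
    "HCC85": 0.331,    # hypothetical condition weight
}

def raf_score(demographic_key: str, hccs: list) -> float:
    return coefficients[demographic_key] + sum(coefficients[h] for h in hccs)

score = raf_score("F_70_74", ["HCC18", "HCC85"])
```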
Dependencies
- Pydantic >= 2.10.3
- Standard Python libraries
Contributing
Join us at mimilabs. Reference data is available in the mimilabs data lakehouse.
License
Apache License 2.0