# coreml-complexity-analyzer

FLOPS estimation, memory analysis, and per-layer profiling for Core ML models.
## Why This Exists

Before deploying Core ML models to Apple devices, developers need to understand:

- **Compute cost:** How many FLOPS/MACs does the model require?
- **Memory footprint:** How much memory is needed for parameters and activations?
- **Bottleneck layers:** Which layers dominate compute and memory?

`coremltools` does not provide these analysis capabilities out of the box. This package fills that gap.
## Installation

```shell
pip install coreml-complexity-analyzer
```

Requirements: `coremltools >= 7.0`, `numpy`
## Quick Start

```python
import coremltools as ct
from coreml_complexity_analyzer import generate_report

model = ct.models.MLModel("my_model.mlpackage")
report = generate_report(model, "ResNet50")

print(f"Total GFLOPS: {report.total_gflops:.2f}")
print(f"Parameters: {report.parameters_millions:.2f}M")
print(f"Memory: {report.memory_breakdown.total_mb:.2f} MB")

# Full markdown report
print(report.to_markdown())
```
## API

### FLOPSAnalyzer

Computes floating-point operations for each layer in a model.

```python
from coreml_complexity_analyzer import FLOPSAnalyzer

analyzer = FLOPSAnalyzer(model)
total = analyzer.get_total_flops()
breakdown = analyzer.get_layer_breakdown()
by_type = analyzer.get_flops_by_op_type()
```
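The per-op-type breakdown is useful for seeing where compute concentrates. Assuming `get_flops_by_op_type()` returns a mapping from op type to raw FLOP count (a typical shape, stood in for here with hypothetical numbers), percentage shares can be derived like this:

```python
# Hypothetical output of analyzer.get_flops_by_op_type(): a dict
# mapping op type to raw FLOP count (shape assumed, not confirmed).
flops_by_type = {
    "conv": 3_891_200_000,
    "linear": 102_400_000,
    "batch_norm": 51_200_000,
}

# Sort op types by FLOP count and print each one's share of the total.
total = sum(flops_by_type.values())
for op, flops in sorted(flops_by_type.items(), key=lambda kv: -kv[1]):
    print(f"{op:<12} {flops:>15,} ({100 * flops / total:5.1f}%)")
```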
### MemoryEstimator

Estimates memory requirements (parameters + activations + overhead).

```python
from coreml_complexity_analyzer import MemoryEstimator

estimator = MemoryEstimator(model)
breakdown = estimator.estimate()
param_count = estimator.get_parameter_count()

print(f"Total: {breakdown.total_mb:.2f} MB")
print(f"Parameters: {breakdown.parameter_mb:.2f} MB")
```
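As a sanity check, parameter memory is just count times bytes per element. A back-of-envelope sketch (not the estimator's internal logic, which may also count overhead) using the ResNet50 parameter count from the example report:

```python
def param_memory_mb(count: int, bytes_per_param: int) -> float:
    """Parameter memory in MB (1 MB = 1024 * 1024 bytes)."""
    return count * bytes_per_param / (1024 * 1024)

param_count = 25_557_032  # ResNet50, as in the example report

# FP32 lands at roughly 97.5 MB, close to the 97.52 MB the report shows;
# halving the precision to FP16 halves the parameter footprint.
print(f"FP32: {param_memory_mb(param_count, 4):.2f} MB")
print(f"FP16: {param_memory_mb(param_count, 2):.2f} MB")
```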
### LayerProfiler

Detailed per-layer analysis combining FLOPS, memory, and shape information.

```python
from coreml_complexity_analyzer import LayerProfiler

profiler = LayerProfiler(model)
profiles = profiler.profile()
top_layers = profiler.get_top_layers(n=5, by="flops")

for layer in top_layers:
    print(f"{layer.name}: {layer.mflops:.2f} MFLOPS")
```
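A cumulative view over the top layers quickly shows how concentrated the compute is. Assuming each profile exposes `name` and `mflops` as in the loop above (modeled here with a stand-in dataclass and hypothetical numbers):

```python
from dataclasses import dataclass

@dataclass
class LayerProfile:
    """Stand-in for the profiler's per-layer record (fields assumed)."""
    name: str
    mflops: float

# Hypothetical top-5 layers, e.g. from profiler.get_top_layers(n=5, by="flops")
top_layers = [
    LayerProfile("conv2d_3", 231.2),
    LayerProfile("conv2d_7", 118.0),
    LayerProfile("linear_0", 102.4),
    LayerProfile("conv2d_1", 57.8),
    LayerProfile("conv2d_12", 51.2),
]

# Print each layer with its running cumulative share of the top-5 total.
total_mflops = sum(p.mflops for p in top_layers)
running = 0.0
for p in top_layers:
    running += p.mflops
    print(f"{p.name:<12} {p.mflops:8.1f} MFLOPS  cum {100 * running / total_mflops:5.1f}%")
```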
### generate_report

Generates a comprehensive report combining all analyses.

```python
from coreml_complexity_analyzer import generate_report

report = generate_report(model, "ModelName")

# Output formats
print(report.to_markdown())  # Markdown table
print(report.to_text())      # Plain text
data = report.to_dict()      # Dictionary
```
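The dictionary form is convenient for persisting results or diffing models across runs. Assuming `to_dict()` returns a JSON-serializable dict (keys below are illustrative, not the package's actual schema):

```python
import json
from pathlib import Path

# Hypothetical report data in the shape report.to_dict() might return;
# any JSON-serializable dict persists the same way.
data = {"model": "ResNet50", "total_gflops": 4.09, "parameters_millions": 25.56}

out = Path("resnet50_report.json")
out.write_text(json.dumps(data, indent=2))
print(out.read_text())
```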
## Supported Operations
| Category | Operations |
|---|---|
| Convolutions | conv, conv_transpose |
| Linear | linear, matmul, einsum |
| Activations | relu, sigmoid, tanh, gelu, silu, softplus |
| Normalization | batch_norm, layer_norm, instance_norm |
| Pooling | max_pool, avg_pool |
| Element-wise | add, sub, mul, div |
| Reductions | reduce_sum, reduce_mean, reduce_max, reduce_min |
| Other | softmax |
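For convolutions, a common counting convention charges 2 FLOPs (one multiply, one add) per multiply-accumulate. A sketch of that convention (not necessarily the exact formula this package implements):

```python
def conv2d_flops(c_in: int, c_out: int, k_h: int, k_w: int,
                 h_out: int, w_out: int, groups: int = 1) -> int:
    """FLOPs for a 2D convolution: 2 FLOPs (multiply + add) per MAC."""
    macs = (c_in // groups) * c_out * k_h * k_w * h_out * w_out
    return 2 * macs

# First conv of a ResNet-style stem: 7x7 kernel, 3 -> 64 channels, 112x112 output
print(f"{conv2d_flops(3, 64, 7, 7, 112, 112):,}")
```

Depthwise and grouped convolutions fall out of the same formula via the `groups` argument, since each output channel only sees `c_in / groups` input channels.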
## Example Output

```text
============================================================
Model Complexity Report: ResNet50
============================================================

SUMMARY
----------------------------------------
Total FLOPS:      4,089,184,256
Total GFLOPS:     4.09
Parameters:       25,557,032
Parameters (M):   25.56

MEMORY ANALYSIS
----------------------------------------
Total Memory:     106.42 MB
Parameters:       97.52 MB

TOP OPERATIONS BY FLOPS
----------------------------------------
conv              3,891,200,000 ( 95.2%)
linear              102,400,000 (  2.5%)
batch_norm           51,200,000 (  1.3%)
============================================================
```
## Use Cases

- **Model Optimization:** Identify bottleneck layers before applying compression
- **Deployment Planning:** Estimate whether a model fits device constraints
- **Model Comparison:** Compare efficiency across architectures
- **Research:** Analyze compute/memory trade-offs
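For deployment planning, the report totals can be checked against a device budget. A minimal sketch with hypothetical budget figures (the two `model_*` values mirror the example report above):

```python
# Hypothetical device budget (illustrative only; pick values for your target).
MEMORY_BUDGET_MB = 150.0
GFLOPS_BUDGET = 10.0

model_memory_mb = 106.42  # e.g. report.memory_breakdown.total_mb
model_gflops = 4.09       # e.g. report.total_gflops

# A model "fits" only if both the memory and compute budgets hold.
fits = model_memory_mb <= MEMORY_BUDGET_MB and model_gflops <= GFLOPS_BUDGET
print("fits budget" if fits else "over budget")
```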
## Development

```shell
git clone https://github.com/yaswanth169/coreml-complexity-analyzer.git
cd coreml-complexity-analyzer
pip install -e ".[dev]"
pytest
```
## License

BSD-3-Clause

## Authors

- Devavarapu Yashwanth
- Ireddi Rakshitha
## File details

### coreml_complexity_analyzer-0.1.0.tar.gz

- Size: 16.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `08ea025013bb1490ab14ad7b8b459594ae8eee7b55980ba799c458fb571b3e17` |
| MD5 | `772afe9e8059f45a9f9b3ab6bec10250` |
| BLAKE2b-256 | `ec90e151d1da6403e1a0752803c9abff66a10067772aaa452a2331ef35e4eb4a` |
### coreml_complexity_analyzer-0.1.0-py3-none-any.whl

- Size: 17.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `0ff07f4aab6d1ff216902fc11a132c19de58ecfe604d3739da896061235f57b0` |
| MD5 | `058ad4dd9a26b1d6dec48af543f93e81` |
| BLAKE2b-256 | `5ffcf58b9488a2d2b0785a9d94330f6784236cb94e635c2f3516e132588e3e39` |