# CoreInsight

Local-first, AI-powered performance profiler for Python, C++, and CUDA.

CoreInsight finds hardware bottlenecks in your code, generates optimized replacements, and verifies the speedup mathematically inside an isolated Docker sandbox, all running locally on your machine.
## Install

```bash
pip install coreinsight-cli
```

Requirements: Python 3.9+ · Docker Desktop · Ollama (for local inference)
## Quick start

```bash
# Configure your AI provider (defaults to Ollama + llama3.2)
coreinsight configure

# Run the built-in demo
coreinsight demo

# Analyse your own file
coreinsight analyze path/to/your_file.py
```
## What it does

CoreInsight runs a full optimization pipeline on every function it extracts:

- Bottleneck analysis
- Code generation
- Sandbox verification
- Hardware profiling

Every result is stored in a local vector database. On repeat analyses, matching patterns are recalled instantly: no LLM call, no sandbox spin-up.
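Conceptually, the recall step is an embedding-based cache: the function's source is embedded, the nearest stored result is looked up, and a close-enough match short-circuits the whole pipeline. Here is a minimal sketch of that pattern with ChromaDB (the store the Troubleshooting section hints at); the collection name, distance threshold, and metadata fields are illustrative assumptions, not CoreInsight's actual schema:

```python
# Illustrative vector-memory recall. NOT CoreInsight's internal API:
# the collection name, threshold, and metadata layout are made up.
import os

import chromadb

client = chromadb.PersistentClient(
    path=os.path.expanduser("~/.coreinsight/memory_db")  # path from the Privacy section
)
collection = client.get_or_create_collection("optimizations")

def recall_or_none(source_code: str, max_distance: float = 0.2):
    """Return a cached optimization if a similar function was seen before."""
    hits = collection.query(query_texts=[source_code], n_results=1)
    if hits["ids"][0] and hits["distances"][0][0] <= max_distance:
        return hits["metadatas"][0][0]  # e.g. {"optimized_code": ..., "speedup": ...}
    return None  # cache miss: run the full LLM + sandbox pipeline
```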
## Commands

| Command | Description |
|---|---|
| `coreinsight analyze <file>` | Analyse a `.py`, `.cpp`, or `.cu` file |
| `coreinsight demo [--lang cpp]` | Run on a built-in example |
| `coreinsight configure` | Set up AI provider and API keys |
| `coreinsight configure --pro-key <key>` | Activate Pro tier |
| `coreinsight memory` | Inspect stored optimizations |
| `coreinsight memory --clear` | Wipe the memory store |
| `coreinsight memory --export out.csv` | Export memory to CSV or Markdown |
| `coreinsight index [--dir <path>]` | Index a repo for cross-file RAG context |
| `coreinsight scan [--dir <path>]` | Rank hotspots by complexity without an LLM (see the sketch below) |
| `coreinsight view` | Launch the interactive TUI |

All commands accept `--no-docker` to skip sandboxing when Docker is unavailable.
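To give an intuition for what LLM-free hotspot ranking looks like, here is a minimal sketch using nothing but Python's `ast` module. The branch-counting score below is a stand-in assumption, not CoreInsight's actual metric:

```python
# Illustrative hotspot ranking by static complexity. NOT CoreInsight's
# actual metric: this just counts branching nodes per function.
import ast
import sys

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp, ast.comprehension)

def rank_hotspots(source: str) -> list[tuple[str, int]]:
    """Return (function_name, score) pairs, most complex first."""
    scores = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            score = 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(node))
            scores.append((node.name, score))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        for name, score in rank_hotspots(f.read()):
            print(f"{score:4d}  {name}")
```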
## Supported languages
| Language | Analysis | Benchmarking | Correctness |
|---|---|---|---|
| Python | ✅ | ✅ | ✅ |
| C++ | ✅ | ✅ | ✅ |
| CUDA | ✅ | ✅ | — |
## AI providers

| Provider | Tier | Notes |
|---|---|---|
| Ollama | Free | `ollama pull llama3.2` |
| LM Studio / vLLM | Free | Any OpenAI-compatible server |
| OpenAI | Pro | GPT 5.3 recommended |
| Anthropic | Pro | Claude 4.6 Sonnet recommended |
| Google Gemini | Pro | Gemini 2.5 Pro recommended |

Local providers run entirely on-device. No code leaves your machine unless you configure a cloud provider.
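"OpenAI-compatible" simply means the server speaks the OpenAI chat-completions wire format, so one client works against Ollama, LM Studio, or vLLM alike. A quick sketch with the `openai` Python package pointed at a default Ollama install (the port and model name are assumptions about your local setup):

```python
# Sketch: any OpenAI-compatible local server looks the same to a client.
# Assumes a default Ollama install (port 11434) with llama3.2 pulled;
# LM Studio (port 1234) and vLLM (port 8000) differ only in base_url.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="unused",                      # local servers accept any placeholder
)

response = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```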
## Pro (free during beta)

Pro unlocks cloud providers and AI-free hardware profiling. Keys are being distributed manually during the beta.

Request a key → tally.so/r/xXZ9YE

```bash
coreinsight configure --pro-key <your-key>
```
## Privacy

- Local providers: nothing leaves your machine
- Cloud providers: only the function code you analyse is sent to the provider API, under your own key
- The memory store lives at `~/.coreinsight/memory_db` on your filesystem
## Troubleshooting

| Problem | Fix |
|---|---|
| Docker not running | Open Docker Desktop, or `sudo systemctl start docker` |
| Ollama model not found | `ollama pull llama3.2` |
| ChromaDB / SQLite error | `pip install pysqlite3-binary` |
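The ChromaDB error usually means your system SQLite is older than the 3.35 that ChromaDB requires; `pysqlite3-binary` ships a newer build. If you hit the same error importing ChromaDB in your own scripts, the widely used community workaround is to alias the bundled module over the stdlib one before ChromaDB loads (a sketch of that workaround, not necessarily what CoreInsight does internally):

```python
# Community workaround for ChromaDB's "unsupported version of sqlite3" error.
# Must run before the first `import chromadb`.
__import__("pysqlite3")
import sys

sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")

import chromadb  # now resolves sqlite3 to the newer bundled build
```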
## Download files
### File details: coreinsight_cli-0.3.2.tar.gz

#### File metadata

- Download URL: coreinsight_cli-0.3.2.tar.gz
- Size: 83.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `60cdd64a383f780b3258fdfbffd1a3d694797254b03a7a637cf51d2466bbb676` |
| MD5 | `1c7838d3ac8a763b60dce1eaf529e7e1` |
| BLAKE2b-256 | `da481a35366d0e3fc2e36cbea811a326977bed3b12893d8c368e6ff090d612fe` |
#### Provenance

The following attestation bundle was made for coreinsight_cli-0.3.2.tar.gz:

Publisher: publish.yml on Prais3/coreinsight_cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: coreinsight_cli-0.3.2.tar.gz
- Subject digest: 60cdd64a383f780b3258fdfbffd1a3d694797254b03a7a637cf51d2466bbb676
- Sigstore transparency entry: 1258319373
- Permalink: Prais3/coreinsight_cli@9e2d6092b453c158e82bef4b122aed1ef4b483c7
- Branch / Tag: refs/tags/v0.3.2
- Owner: https://github.com/Prais3
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@9e2d6092b453c158e82bef4b122aed1ef4b483c7
- Trigger Event: push
### File details: coreinsight_cli-0.3.2-py3-none-any.whl

#### File metadata

- Download URL: coreinsight_cli-0.3.2-py3-none-any.whl
- Size: 88.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `81cb0c209eb76e413113176a882a1e8bfc17a93199090f0059f11b83765929e6` |
| MD5 | `97d2873e1be6e164f64d696ad9f32242` |
| BLAKE2b-256 | `da419c752e788b1c108d8948965f5b39d61897a16c780b7e4a0a4762b94810f1` |

#### Provenance

The following attestation bundle was made for coreinsight_cli-0.3.2-py3-none-any.whl:

Publisher: publish.yml on Prais3/coreinsight_cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: coreinsight_cli-0.3.2-py3-none-any.whl
- Subject digest: 81cb0c209eb76e413113176a882a1e8bfc17a93199090f0059f11b83765929e6
- Sigstore transparency entry: 1258319375
- Permalink: Prais3/coreinsight_cli@9e2d6092b453c158e82bef4b122aed1ef4b483c7
- Branch / Tag: refs/tags/v0.3.2
- Owner: https://github.com/Prais3
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@9e2d6092b453c158e82bef4b122aed1ef4b483c7
- Trigger Event: push