Local-first AI performance profiler that mathematically verifies optimizations for Python, C++, and CUDA
Project description
CoreInsight CLI
CoreInsight is a local-first, hardware-aware AI performance profiler. It shifts performance engineering "left" by parsing your Python, C++, and CUDA code, identifying hardware bottlenecks (like CPU cache thrashing or CUDA warp divergence), and mathematically verifying AI-generated optimizations inside secure Docker sandboxes.
Prerequisites
- Python 3.9+
- Docker Desktop / Docker Engine (must be running for sandbox verification)
  - Install Docker: https://docs.docker.com/engine/install/
- Ollama (optional, if using local models) or API keys for cloud models
  - Suggested: set up OpenAI/Anthropic/Google API keys to use those models
Install
pip install coreinsight-cli
Usage
1. Build Locally: Clone this repository and install it in editable mode:
pip install -e .
2. Configure CoreInsight CLI: Set up your preferred AI provider (Ollama, local vLLM, OpenAI, Anthropic, or Gemini):
coreinsight configure
3. Build Global Context (Recommended for multiple files): Index your repository so the AI understands your custom structs, classes, and dependencies across files:
coreinsight index
4. Test on a file: Analyze a specific file. The CLI will extract hot loops, process them in parallel, verify optimizations in Docker, and output a live Markdown report.
coreinsight analyze <file_name>
5. Project-Wide Hotspot Scanning: Instead of guessing which files are slow, scan your entire repository. CoreInsight will use static AST analysis to rank the most complex, deeply nested loops in your project.
coreinsight scan
Build the Project
Install the build tool:
pip install build
Run the build command to generate the wheel file:
python -m build --wheel
To install the project elsewhere using the wheel file:
pip install dist/coreinsight_cli-*.whl
Architecture Notes
CoreInsight runs 100% locally. Code is only transmitted to the AI provider you configure. If you use Ollama or a local server, your proprietary code never leaves your machine.
Download files
File details
Details for the file coreinsight_cli-0.1.1.tar.gz.
File metadata
- Download URL: coreinsight_cli-0.1.1.tar.gz
- Upload date:
- Size: 27.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6a93c7549c8cb1f747e7474e80dd62ff4263ff2240892b0cb18045920ed1c95b |
| MD5 | 944479a1756e89c9dfc89db26bac984d |
| BLAKE2b-256 | c38d1e64c7092cda6f949ee0863e9571f94b04b48085c8aad8e8f32912cf1495 |
File details
Details for the file coreinsight_cli-0.1.1-py3-none-any.whl.
File metadata
- Download URL: coreinsight_cli-0.1.1-py3-none-any.whl
- Upload date:
- Size: 29.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | effa383b5e2a3aa019623cb378cd19567f723942ef3281fe415bbc2f378e34c8 |
| MD5 | 1b651b2608f4f4f87a21f31bdeacb8e5 |
| BLAKE2b-256 | 845d940e891cc478b86dd82f8f1dc3ff265841bf194590deafda57004433de2b |
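As a sanity check after downloading, the digests above can be verified locally with Python's standard hashlib module. A minimal sketch (the file path shown is a placeholder for wherever you saved the sdist):

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Usage (placeholder path): the result should match the SHA256 value
# published for coreinsight_cli-0.1.1.tar.gz above, i.e.
# "6a93c7549c8cb1f747e7474e80dd62ff4263ff2240892b0cb18045920ed1c95b".
# sha256_of_file("coreinsight_cli-0.1.1.tar.gz")
```

Reading in chunks keeps memory usage constant regardless of file size, which matters more for larger artifacts than for this small package.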