
CoreInsight

AI-powered performance profiler for Python, C++, and CUDA.

CoreInsight finds hardware bottlenecks in your code, generates optimized replacements, and verifies the speedup mathematically inside an isolated Docker sandbox — all running locally on your machine.


Install

```shell
pip install coreinsight-cli
```

Requirements: Python 3.9+ · Docker Desktop · Ollama (for local inference)


Quick start

```shell
# Configure your AI provider (defaults to Ollama + llama3.2)
coreinsight configure

# Run the built-in demo
coreinsight demo

# Analyse your own file
coreinsight analyze path/to/your_file.py
```
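If you want something to try `analyze` on, a small file with an obvious hotspot works well. The file name and function below are inventions for illustration, not part of CoreInsight:

```python
# example_target.py — a deliberately slow function to point `coreinsight analyze` at
def pairwise_sums(values):
    """Return every pairwise sum — quadratic on purpose."""
    out = []
    for i in range(len(values)):
        for j in range(len(values)):
            out.append(values[i] + values[j])
    return out

if __name__ == "__main__":
    # 100 inputs produce 100 * 100 = 10,000 pairs
    print(len(pairwise_sums(list(range(100)))))
```

Then run `coreinsight analyze example_target.py`.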

What it does

CoreInsight runs a full optimization pipeline on every function it extracts:

  1. Bottleneck analysis: identify the hardware bottleneck in each extracted function
  2. Code generation: produce an optimized replacement
  3. Sandbox verification: confirm correctness and measure the speedup in an isolated Docker container
  4. Hardware profiling: record how the optimized code behaves on your machine

Every result is stored in a local vector database. On repeat analyses, matching patterns are recalled instantly — no LLM call, no sandbox spin-up.
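The recall idea can be sketched as a lookup keyed by a hash of the function source. This is a simplification for illustration only: the class and method names are hypothetical, and CoreInsight's actual store is a vector database with similarity matching rather than exact hash lookup.

```python
import hashlib

class OptimizationMemory:
    """Hypothetical sketch: exact-match cache of prior optimization results."""

    def __init__(self):
        self._store = {}  # source hash -> cached result

    @staticmethod
    def _key(source: str) -> str:
        return hashlib.sha256(source.encode("utf-8")).hexdigest()

    def remember(self, source: str, result: dict) -> None:
        self._store[self._key(source)] = result

    def recall(self, source: str):
        # Cache hit means no LLM call and no sandbox spin-up
        return self._store.get(self._key(source))
```

A vector store generalizes this by recalling *similar* functions, not just byte-identical ones.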


Commands

| Command | Description |
| --- | --- |
| `coreinsight analyze <file>` | Analyse a `.py`, `.cpp`, or `.cu` file |
| `coreinsight demo [--lang cpp]` | Run on a built-in example |
| `coreinsight configure` | Set up the AI provider and API keys |
| `coreinsight configure --pro-key <key>` | Activate the Pro tier |
| `coreinsight memory` | Inspect stored optimizations |
| `coreinsight memory --clear` | Wipe the memory store |
| `coreinsight memory --export out.csv` | Export memory to CSV or Markdown |
| `coreinsight index [--dir <path>]` | Index a repo for cross-file RAG context |
| `coreinsight scan [--dir <path>]` | Rank hotspots by complexity without an LLM |
| `coreinsight view` | Launch the interactive TUI |

All commands accept `--no-docker` to skip sandboxing when Docker is unavailable.
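As an illustration of LLM-free hotspot ranking of the kind `scan` performs, even a crude AST branch count yields a usable ordering. This is a sketch of the idea, not CoreInsight's actual metric:

```python
import ast

def rank_hotspots(source: str):
    """Rank functions by a naive branch count — no LLM involved."""
    scores = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Count control-flow constructs as a rough complexity proxy
            branches = sum(
                isinstance(n, (ast.If, ast.For, ast.While, ast.Try))
                for n in ast.walk(node)
            )
            scores[node.name] = branches
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

A real scanner would weight nesting depth and loop bodies, but the principle is the same: static structure alone is enough to triage candidates.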


Supported languages

| Language | Analysis | Benchmarking | Correctness |
| --- | --- | --- | --- |
| Python | ✓ | ✓ | ✓ |
| C++ | ✓ | ✓ | ✓ |
| CUDA | ✓ | ✓ | ✓ |

AI providers

| Provider | Tier | Notes |
| --- | --- | --- |
| Ollama | Free | `ollama pull llama3.2` |
| LM Studio / vLLM | Free | Any OpenAI-compatible server |
| OpenAI | Pro | GPT 5.3 recommended |
| Anthropic | Pro | Claude 4.6 Sonnet recommended |
| Google Gemini | Pro | Gemini 2.5 Pro recommended |

Local providers run entirely on-device. No code leaves your machine unless you configure a cloud provider.


Pro — free during beta

Pro unlocks cloud providers and AI-free hardware profiling.
Keys are being distributed manually during the beta.

Request a key → tally.so/r/xXZ9YE

```shell
coreinsight configure --pro-key <your-key>
```

Privacy

  • Local providers — nothing leaves your machine
  • Cloud providers — only the function code you analyse is sent to the provider API, under your own key
  • The memory store lives at ~/.coreinsight/memory_db on your filesystem

Troubleshooting

  • Docker not running: open Docker Desktop, or on Linux run `sudo systemctl start docker`
  • Ollama model not found: run `ollama pull llama3.2`
  • ChromaDB / SQLite error: run `pip install pysqlite3-binary`
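The `pysqlite3-binary` fix exists because ChromaDB requires a newer SQLite than some systems ship. A commonly used companion snippet aliases the replacement module over the stdlib one; it must run before anything imports `sqlite3`:

```python
import sys

try:
    # pysqlite3-binary bundles a recent SQLite build
    __import__("pysqlite3")
    sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")
except ImportError:
    pass  # pysqlite3-binary not installed; fall back to the stdlib build

import sqlite3
print(sqlite3.sqlite_version)
```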

Download files

  • Source distribution: coreinsight_cli-0.3.0.tar.gz (79.0 kB)
  • Built distribution: coreinsight_cli-0.3.0-py3-none-any.whl (83.8 kB)
File details

Details for the file coreinsight_cli-0.3.0.tar.gz.

File metadata

  • Size: 79.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 0a66a7b6ea82fb35c8a8d83a7279baed1ada0d91773eccfb4f9aa30bfb751081 |
| MD5 | 79f8399d1e1b80cb7887601b097a77e4 |
| BLAKE2b-256 | 7a2eb7a94c328cc70c4724a3c3ff9bada87cdffcff1c0ca6a1d5188e99abee32 |
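To verify a download against the digests above, hash the file locally and compare. A minimal streaming checksum helper:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream the file in chunks so large artifacts aren't read into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("coreinsight_cli-0.3.0.tar.gz") == "0a66a7b6ea82..."
```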

Provenance

The following attestation bundles were made for coreinsight_cli-0.3.0.tar.gz:

Publisher: publish.yml on Prais3/coreinsight_cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file coreinsight_cli-0.3.0-py3-none-any.whl.

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 7c68309acfb2d3955c37808964bbee2d8b8e6d9e42b6c35eba7c1992844f421f |
| MD5 | 2077ef988021f8f6072052693836f0f5 |
| BLAKE2b-256 | ac99fe07d5cd9a0be56ca8c90bf3e3f5a7439f46bd6a1db0beaa1e083008c03a |

Provenance

The following attestation bundles were made for coreinsight_cli-0.3.0-py3-none-any.whl:

Publisher: publish.yml on Prais3/coreinsight_cli

