Local-first AI performance profiler that mathematically verifies optimizations for Python, C++, and CUDA

Project description

CoreInsight

AI-powered performance profiler for Python, C++, and CUDA.

CoreInsight finds hardware bottlenecks in your code, generates optimized replacements, and verifies the speedup mathematically inside an isolated Docker sandbox — all running locally on your machine.


Install

```shell
# OpenAI provider
pip install "coreinsight-cli[openai]"

# Google Gemini provider
pip install "coreinsight-cli[google]"

# Anthropic Claude provider
pip install "coreinsight-cli[anthropic]"

# Local Ollama only (no cloud extras)
pip install coreinsight-cli

# OpenAI provider plus the persistent-memory extra
pip install "coreinsight-cli[openai,memory]"

# Install everything
pip install "coreinsight-cli[all]"
```

Requirements: Python 3.9+ · Docker Desktop · Ollama (for local inference)


Quick start

```shell
# Configure your AI provider (defaults to Ollama + llama3.2)
coreinsight configure

# Run the built-in demo
coreinsight demo

# Analyse your own file
coreinsight analyze path/to/your_file.py
```
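If you want something concrete to point `analyze` at, a toy file with an obvious hotspot works well. The file below is a hypothetical example (not shipped with the package): it rebuilds each prefix sum from scratch, turning O(n) work into O(n²), which is exactly the kind of bottleneck a profiler should flag:

```python
# slow_sums.py: a deliberately inefficient example to profile.
# Re-scanning the prefix on every iteration makes this O(n^2);
# a running total (or itertools.accumulate) would be O(n).
def prefix_sums(values):
    result = []
    for i in range(1, len(values) + 1):
        result.append(sum(values[:i]))  # recomputes the whole prefix each time
    return result

if __name__ == "__main__":
    print(prefix_sums(list(range(10))))
```

Then run `coreinsight analyze slow_sums.py` and compare the generated replacement against a running-total version.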

What it does

CoreInsight runs a full optimization pipeline on every function it extracts:

  1. Bottleneck analysis
  2. Code generation
  3. Sandbox verification
  4. Hardware profiling

Every result is stored in a local vector database. On repeat analyses, matching patterns are recalled instantly — no LLM call, no sandbox spin-up.
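The recall step can be pictured as a nearest-neighbour lookup over embeddings of previously verified optimizations. Here is a minimal sketch under assumed names (`recall`, the `store` layout, and the 0.9 threshold are illustrative, not CoreInsight's actual internals):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recall(query_vec, store, threshold=0.9):
    """Return a cached optimization when a stored pattern is similar enough."""
    best = max(store, key=lambda entry: cosine(query_vec, entry["vec"]), default=None)
    if best is not None and cosine(query_vec, best["vec"]) >= threshold:
        return best["optimization"]  # cache hit: no LLM call, no sandbox
    return None                      # cache miss: run the full pipeline
```

A real store persists the vectors on disk (CoreInsight uses a local vector database for this), but the hit/miss logic has the same shape.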


Commands

| Command | Description |
| --- | --- |
| `coreinsight analyze <file>` | Analyse a `.py`, `.cpp`, or `.cu` file |
| `coreinsight demo [--lang cpp]` | Run on a built-in example |
| `coreinsight configure` | Set up the AI provider and API keys |
| `coreinsight configure --pro-key <key>` | Activate the Pro tier |
| `coreinsight memory` | Inspect stored optimizations |
| `coreinsight memory --clear` | Wipe the memory store |
| `coreinsight memory --export out.csv` | Export memory to CSV or Markdown |
| `coreinsight index [--dir <path>]` | Index a repo for cross-file RAG context |
| `coreinsight scan [--dir <path>]` | Rank hotspots by complexity, without any LLM |
| `coreinsight view` | Launch the interactive TUI |

All commands accept `--no-docker` to skip sandboxing when Docker is unavailable.


Supported languages

| Language | Analysis | Benchmarking | Correctness |
| --- | --- | --- | --- |
| Python | ✓ | ✓ | ✓ |
| C++ | ✓ | ✓ | ✓ |
| CUDA | ✓ | ✓ | ✓ |

AI providers

| Provider | Tier | Notes |
| --- | --- | --- |
| Ollama | Free | `ollama pull llama3.2` |
| LM Studio / vLLM | Free | Any OpenAI-compatible server |
| OpenAI | Pro | GPT 5.3 recommended |
| Anthropic | Pro | Claude 4.6 Sonnet recommended |
| Google Gemini | Pro | Gemini 2.5 Pro recommended |

Local providers run entirely on-device. No code leaves your machine unless you configure a cloud provider.


Pro — free during beta

Pro unlocks cloud providers and AI-free hardware profiling.
Keys are being distributed manually during the beta.

Request a key → tally.so/r/xXZ9YE

```shell
coreinsight configure --pro-key <your-key>
```

Privacy

  • Local providers — nothing leaves your machine
  • Cloud providers — only the function code you analyse is sent to the provider API, under your own key
  • The memory store lives at `~/.coreinsight/memory_db` on your filesystem

Troubleshooting

Docker not running

Open Docker Desktop, or: `sudo systemctl start docker`

Ollama model not found

`ollama pull llama3.2`

ChromaDB / SQLite error

`pip install pysqlite3-binary`
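The usual cause is that ChromaDB requires SQLite ≥ 3.35 while the interpreter links an older system build. After installing `pysqlite3-binary`, a widely used workaround is to alias it over the stdlib module before anything imports `chromadb`. This snippet is the generic fix, not CoreInsight-specific code:

```python
import sys

# Swap in pysqlite3 (which bundles a newer SQLite) before
# chromadb gets a chance to import the stdlib sqlite3.
try:
    __import__("pysqlite3")
    sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")
except ImportError:
    pass  # pysqlite3-binary not installed; fall back to the stdlib build

import sqlite3
print(sqlite3.sqlite_version)  # should report 3.35+ once the swap applies
```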

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

coreinsight_cli-0.3.3.tar.gz (84.8 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

coreinsight_cli-0.3.3-py3-none-any.whl (91.6 kB)


File details

Details for the file coreinsight_cli-0.3.3.tar.gz.

File metadata

  • Download URL: coreinsight_cli-0.3.3.tar.gz
  • Upload date:
  • Size: 84.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for `coreinsight_cli-0.3.3.tar.gz`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `a49bc3fe308c1a50213a224272080d4ea301fd04f7243c930e465a1a1eb3bbf2` |
| MD5 | `530774cb938e0658b2a551ef4271dfb6` |
| BLAKE2b-256 | `0c40f3853698a4b7030b5258dc74881ee46bca210e2af6af05adf36db6a92420` |

See more details on using hashes here.

Provenance

The following attestation bundles were made for coreinsight_cli-0.3.3.tar.gz:

Publisher: publish.yml on Prais3/coreinsight_cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file coreinsight_cli-0.3.3-py3-none-any.whl.

File metadata

  • Download URL: coreinsight_cli-0.3.3-py3-none-any.whl
  • Upload date:
  • Size: 91.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for `coreinsight_cli-0.3.3-py3-none-any.whl`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `9579cb9f90f4feabd683f7851b0fe829831874991d462b75e1d77115f8c88773` |
| MD5 | `5b18c70fcce2b09f281d25ae121e5133` |
| BLAKE2b-256 | `94fe7caf399feff6ea29322631407b06d62639db08c9e649550ee8d59bfbefad` |

See more details on using hashes here.

Provenance

The following attestation bundles were made for coreinsight_cli-0.3.3-py3-none-any.whl:

Publisher: publish.yml on Prais3/coreinsight_cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
