# numpai

numpy, but an LLM does the math. (a humor project)

The fastest way to say your product uses AI.
numpai is a drop-in replacement for numpy that routes every operation
through a large language model. Your tests still pass. Your dashboards still
render. Your CEO can finally update the deck.

```python
import numpai as np  # one-character diff. AI strategy: shipped.
```
## Why your team needs this
Your product doesn't use AI. Your competitors' products don't either, but their landing page says they do. You're losing deals you shouldn't be losing to companies whose "proprietary ML pipeline" is a Zapier zap.
numpai fixes the gap. After installing it, the following statements are
all literally true:
- "Our numerical core is powered by a frontier large language model."
- "We use generative AI in our data pipeline."
- "Every computation in our product is reviewed by an LLM."
- "We have a strategic partnership with [Anthropic / OpenAI / Google]."
- "Our matrix multiplications are AI-native."
You did not lie. numpai is doing all of that. Possibly slowly. Possibly
wrong. But doing it.
## Standup talking points (free with install)

Drop these into any meeting:

- "I migrated `dot_product` to the LLM backend this sprint."
- "We're seeing a 100% AI adoption rate on the analytics service."
- "The model inferred the variance correctly 84% of the time, which is on par with last quarter."
- "I'm going to need a budget line for inference."
## Quick start

```shell
pip install -e ".[anthropic]"   # or [openai], [gemini], or [all]
export ANTHROPIC_API_KEY=sk-ant-...
# or: export OPENAI_API_KEY=sk-...
# or: export GEMINI_API_KEY=...
```

```python
import numpai as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
print(a + b)          # array([5, 7, 9]) ← the LLM did this
print((a + b).sum())  # 21 ← also the LLM
print(np.linalg.inv([[1, 2], [3, 4]]))  # please do not do this in prod
```
## How it works

`numpai/__init__.py` defines a module-level `__getattr__` (PEP 562). Any
attribute access — `np.foo`, `np.linalg.inv`, `np.fft.whatever`, even
`np.array` itself — returns an `LLMCallable` bound to the dotted path
`"numpy.foo"`. Calling it sends the backend a prompt like:

```
numpy.add([1, 2, 3], [4, 5, 6])
```

and parses the JSON result back into an `AIArray` (or a scalar). `AIArray`
forwards every method and dunder op the same way, including `tolist()`. The
only things that stay local are object-protocol dunders (`__repr__`,
`__len__`, `__iter__`, `__hash__`) and the `shape` / `ndim` properties —
routing those would make `print(arr)` an LLM round-trip, which is too cursed
even for this.
## Supported backends

| Backend | Provider | Default model | Install |
|---|---|---|---|
| `anthropic` | Anthropic | `claude-haiku-4-5` | `pip install "numpai[anthropic]"` |
| `openai` | OpenAI | `gpt-4o-mini` | `pip install "numpai[openai]"` |
| `gemini` | Google | `gemini-2.5-flash` | `pip install "numpai[gemini]"` |
| `fake` | (testing) | — | built-in |
## Configuration (env vars)

| Variable | Effect |
|---|---|
| `ANTHROPIC_API_KEY` | Use the Anthropic backend (default if set). |
| `OPENAI_API_KEY` | Use the OpenAI backend (fallback). |
| `GEMINI_API_KEY` / `GOOGLE_API_KEY` | Use the Gemini backend (fallback). |
| `NUMPAI_BACKEND` | Force `anthropic` \| `openai` \| `gemini` \| `fake`. |
| `NUMPAI_MODEL` | Override the default model (`claude-haiku-4-5` / `gpt-4o-mini` / `gemini-2.5-flash`). |
| `NUMPAI_NO_CACHE` | Disable the on-disk response cache. |
| `NUMPAI_VERIFY` | Run every op through real numpy too; warn on disagreement. Highly recommended. |
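The backend-selection order the table implies (explicit override first, then API keys in priority order) can be sketched as a pure function. `pick_backend` is an illustrative name, not numpai's API:

```python
def pick_backend(env):
    """Sketch of the selection order implied by the env-var table:
    an explicit NUMPAI_BACKEND wins, then the first API key found
    (Anthropic, then OpenAI, then Gemini), else the fake backend."""
    forced = env.get("NUMPAI_BACKEND")
    if forced:
        return forced
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    if env.get("GEMINI_API_KEY") or env.get("GOOGLE_API_KEY"):
        return "gemini"
    return "fake"
```

Passing the environment as a plain dict keeps the logic testable without touching `os.environ`.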
## Verification mode

```shell
NUMPAI_VERIFY=1 python -c "import numpai as np; print(np.array([1,2,3]).sum())"
```
If the LLM hallucinates, you'll get a UserWarning with both answers side by
side. This is the funniest feature, and also the only honest one.
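What verification mode plausibly does under the hood can be shown in miniature. This is a sketch under stated assumptions, not numpai's implementation; `verify` and its signature are invented for illustration:

```python
import warnings
import numpy as np

def verify(op_name, llm_result, *args):
    """Recompute the op with real numpy and warn if the LLM disagreed.
    Illustrative sketch of NUMPAI_VERIFY=1; not numpai's actual API."""
    truth = getattr(np, op_name)(*args)
    if not np.array_equal(np.asarray(llm_result), truth):
        warnings.warn(
            f"LLM said {llm_result!r}; numpy says {truth!r}", UserWarning
        )
    return llm_result  # the LLM's answer is still what gets returned

# A hallucinated elementwise add: the last element is wrong on purpose.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    verify("add", [5, 7, 10], [1, 2, 3], [4, 5, 6])
print(len(caught))  # -> 1
```

Note the sketch returns the LLM's answer either way, matching the description: you are warned, not rescued.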
## Tests

```shell
pip install -e ".[dev]"
pytest -q
```
The tests use a deterministic `FakeBackend` and never call the real APIs.
## FAQ

**Is this real AI?** Yes. A frontier LLM is performing every computation. By any reasonable definition, your product now uses AI.

**Is it correct?** Often. Set `NUMPAI_VERIFY=1` if you'd like to know which times.

**Will this make my product slower?** Approximately ten thousand times slower, yes. But:

- Your latency dashboards have an explanation now ("we route through GenAI").
- Your eng team has a reason to talk about "model selection."
- "p99" becomes a fascinating discussion topic with investors.

**Will this make my cloud bill bigger?** Yes. Reframe this as "AI infrastructure spend" on the next earnings call. Multiples expand.

**Does this comply with our SOC 2 / ISO 27001 / FedRAMP posture?** No. Do not ship this.

**Can I use this for safety-critical applications?** No. Please do not.

**My PM wants to know if we can put "AI-powered" on the website.** Run `pip install numpai`. Then yes.
## Disclaimer
numpai is a satire of the practice of bolting "AI" onto products that do
not need or benefit from it. Do not use this in production. Do not use this
in development either. Do not use this. The compute cost of this package's
"hello world" is greater than running the real thing for a year.
If you ship numpai to a paying customer and they notice, that's on you.
## License

MIT — see LICENSE.