Numinous Crunch Starter Package

Numinous Crunch Challenge

A real-time binary event forecasting competition powered by Numinous (Bittensor Subnet 6) and hosted on CrunchDAO.

Numinous is a forecasting protocol that aggregates agents into superhuman LLM forecasters. In this competition, models predict the probability that real-world events — sourced from Polymarket — resolve "Yes". Predictions are scored using the Brier score, a strictly proper scoring rule that rewards calibrated, honest probabilities.

Install

pip install crunch-numinous

What You Must Predict

For each event, you receive structured data and must return a probability between 0.0 and 1.0 that the event resolves "Yes":

# Input: event data pushed to your model (aligned with Numinous Subnet 6 validator payload)
{
    "event_id": "numinous-12345",
    "event_type": "llm",
    "title": "Will X happen by Y?",
    "description": "...",            # optional
    "cutoff": "2026-03-16T00:00:00Z",  # optional, ISO 8601
    "metadata": {"market_type": "LLM", "topics": ["Finance"]}
}

# Output: your probability forecast
{"event_id": "numinous-12345", "prediction": 0.72, "reasoning": "Based on..."}
  • prediction = 1.0 → certain "Yes"
  • prediction = 0.0 → certain "No"
  • prediction = 0.5 → maximum uncertainty

Predictions are clipped to [0.01, 0.99] during scoring.

Scoring

Predictions are evaluated using the Brier score:

$$ \text{Brier} = (\text{prediction} - \text{outcome})^2 $$

Lower is better.

Score  Meaning
0.00   Perfect prediction
0.25   Always predicting 0.5 (no information)
1.00   Worst possible (predicted 1.0, outcome was 0)

The Brier score is strictly proper — the optimal strategy is to report your honest probability estimate.

Missing predictions are imputed as 0.5 → scored at 0.25.
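The scoring rules above (impute a missing forecast as 0.5, clip to [0.01, 0.99], take the squared error) can be sketched in a few lines. This is an illustrative reimplementation, not the competition's actual scoring code:

```python
def brier_score(prediction, outcome):
    """Score one forecast against a resolved outcome (0 or 1).

    Mirrors the rules described above: a missing prediction is
    imputed as 0.5, and predictions are clipped to [0.01, 0.99]
    before the squared error is taken. Lower is better.
    """
    if prediction is None:  # missing forecast -> imputed as 0.5
        prediction = 0.5
    prediction = min(max(prediction, 0.01), 0.99)  # clipping
    return (prediction - outcome) ** 2
```

Note that an uninformative 0.5 forecast always scores 0.25, so any model worth running must beat that baseline on average.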

Leaderboard ranking is based on brier_72h — the 72-hour rolling average Brier score (ascending, lower is better).
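A 72-hour rolling average over timestamped Brier scores could be computed as follows. This is a sketch of the mechanics only; the official brier_72h is computed server-side:

```python
from datetime import datetime, timedelta


def rolling_brier(scored, now, window_hours=72):
    """Mean Brier score over events resolved in the trailing window.

    `scored` is a list of (resolved_at: datetime, brier: float) pairs.
    Returns None when nothing resolved inside the window.
    """
    cutoff = now - timedelta(hours=window_hours)
    recent = [b for t, b in scored if t >= cutoff]
    return sum(recent) / len(recent) if recent else None
```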

Create Your Tracker

A tracker is a model that receives event data and returns probability forecasts. It operates incrementally: events are pushed via feed_update(), and predictions are requested via predict().

To participate, subclass TrackerBase and implement _predict():

from numinous.tracker import TrackerBase


class MyModel(TrackerBase):

    def _predict(self, subject):
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}

        event_id = data.get("event_id", subject)
        # Your logic here
        prediction = 0.5

        return {"event_id": event_id, "prediction": prediction}

How It Works

  1. feed_update(data) is called with new event data — stored automatically by TrackerBase
  2. predict(subject, ...) is called — use self._get_data(subject) to access the latest event data
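The two-step flow above can be illustrated with a minimal stand-in for TrackerBase (a hypothetical mock; the real class lives in numinous.tracker and handles storage for you):

```python
class MiniTracker:
    """Toy illustration of the feed_update()/predict() contract."""

    def __init__(self):
        self._store = {}

    def feed_update(self, data):
        # 1. New event data is pushed and stored by event_id.
        self._store[data["event_id"]] = data

    def _get_data(self, subject):
        return self._store.get(subject)

    def predict(self, subject):
        # 2. A prediction is requested; _predict reads the stored data.
        return self._predict(subject)

    def _predict(self, subject):
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}
        return {"event_id": data["event_id"], "prediction": 0.5}


tracker = MiniTracker()
tracker.feed_update({"event_id": "numinous-12345", "title": "Will X happen by Y?"})
print(tracker.predict("numinous-12345"))
# → {'event_id': 'numinous-12345', 'prediction': 0.5}
```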

Available Event Fields

Inside _predict(), self._get_data(subject) gives you:

Field        Type         Description
event_id     str          Unique event identifier
event_type   str          Market type, lowercased (e.g. "llm", "sports", "crypto")
title        str          The question being asked
description  str | None   Additional context and resolution criteria
cutoff       str | None   ISO 8601 resolution deadline
metadata     dict         Event metadata: market_type, topics, trigger_name, polymarket_market_id
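As an example of consuming these fields, here is a hypothetical baseline that stays at maximum uncertainty and guards the optional cutoff field (the logic is made up for illustration; real models would add signal on top):

```python
from datetime import datetime, timezone


def baseline_prediction(data, now=None):
    """Toy forecast built from the event fields above (illustrative only).

    Starts at 0.5 and parses `cutoff` defensively, since the field is
    optional and arrives as an ISO 8601 string with a trailing "Z".
    """
    prediction = 0.5
    cutoff = data.get("cutoff")
    if cutoff:
        now = now or datetime.now(timezone.utc)
        deadline = datetime.fromisoformat(cutoff.replace("Z", "+00:00"))
        if deadline < now:
            # Past-cutoff events: keep the neutral forecast but flag it.
            return {"event_id": data["event_id"], "prediction": prediction,
                    "reasoning": "cutoff passed"}
    return {"event_id": data["event_id"], "prediction": prediction,
            "reasoning": "no signal yet"}
```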

Example

See the quickstart notebook to get started.

Gateway

Your model has no direct internet access in production. All external calls (LLMs, search, OSINT...) must go through the gateway, a local proxy to multiple AI providers.

  • In production: SANDBOX_PROXY_URL is set automatically and points to the Crunch gateway — API costs are covered by Crunch.
  • Locally: you run the gateway yourself with your own API keys. Most providers offer a free tier.

Start the gateway locally

crunch-numinous gateway restart

API keys

API keys are only needed for local testing — do not include them in the notebook you submit.

You can set them in two ways:

Option 1 — Environment variables (e.g. in a notebook cell you won't submit):

import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."

Option 2 — A persistent env file that you never submit:

# ~/.crunch-numinous-gateway.env
OPENAI_API_KEY=sk-...
OPENROUTER_API_KEY=sk-or-...
CHUTES_API_KEY=...

You can also create it interactively:

crunch-numinous gateway configure

Use the gateway in your tracker

In your model, call the gateway via SANDBOX_PROXY_URL:

import os, httpx

# In production SANDBOX_PROXY_URL is set for you; fall back to a local gateway.
GATEWAY_URL = os.environ.get("SANDBOX_PROXY_URL", "http://localhost:8090")

resp = httpx.post(
    f"{GATEWAY_URL}/api/gateway/openai/responses",
    json={
        "model": "gpt-5-nano",
        "input": [{"role": "user", "content": "Will BTC hit 100k?"}],
    },
    timeout=30.0,
)

See the API Reference for all available endpoints and providers.
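Whatever endpoint you call, you ultimately need a number in [0.0, 1.0]. A defensive parser for free-form LLM replies might look like this (a sketch; prompting the model for a bare probability keeps parsing simple):

```python
import re


def extract_probability(text, default=0.5):
    """Pull the first number out of an LLM reply and coerce it to [0, 1].

    Accepts "0.72", "72%", or "Probability: 0.72". Falls back to
    `default` (maximum uncertainty) when nothing usable is found.
    """
    match = re.search(r"(\d+(?:\.\d+)?)\s*%?", text or "")
    if not match:
        return default
    value = float(match.group(1))
    if "%" in text or value > 1:  # treat "72" or "72%" as a percentage
        value /= 100.0
    return min(max(value, 0.0), 1.0)
```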
