Numinous Crunch Starter Package

Numinous Crunch Challenge

A real-time binary event forecasting competition powered by Numinous (Bittensor Subnet 6) and hosted on CrunchDAO.

Numinous is a forecasting protocol that aggregates agents into superhuman LLM forecasters. In this competition, models predict the probability that real-world events — sourced from Polymarket — resolve "Yes". Predictions are scored using the Brier score, a strictly proper scoring rule that rewards calibrated, honest probabilities.

Install

pip install crunch-numinous

What You Must Predict

For each event, you receive structured data and must return a probability between 0.0 and 1.0 that the event resolves "Yes":

# Input: event data pushed to your model (aligned with Numinous Subnet 6 validator payload)
{
    "event_id": "numinous-12345",
    "event_type": "llm",
    "title": "Will X happen by Y?",
    "description": "...",            # optional
    "cutoff": "2026-03-16T00:00:00Z", # optional, ISO 8601
    "metadata": {"market_type": "LLM", "topics": ["Finance"]}
}

# Output: your probability forecast
{"event_id": "numinous-12345", "prediction": 0.72, "reasoning": "Based on..."}
  • prediction = 1.0 → certain "Yes"
  • prediction = 0.0 → certain "No"
  • prediction = 0.5 → maximum uncertainty

Predictions are clipped to [0.01, 0.99] during scoring.
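The clipping step can be sketched as follows (`clip_prediction` is a hypothetical helper name, not part of the package API; it simply mirrors the [0.01, 0.99] bounds above):

```python
def clip_prediction(p: float) -> float:
    """Clip a raw probability into the [0.01, 0.99] range used during scoring."""
    return max(0.01, min(0.99, p))

# Extreme forecasts are pulled back from the boundaries:
print(clip_prediction(1.0))   # 0.99
print(clip_prediction(0.0))   # 0.01
print(clip_prediction(0.72))  # 0.72
```

In practice this means a confident-but-wrong forecast of 1.0 costs (0.99 - 0)² ≈ 0.98 rather than the full 1.0.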

Scoring

Predictions are evaluated using the Brier score:

$$ \text{Brier} = (\text{prediction} - \text{outcome})^2 $$

Lower is better.

Score  Meaning
0.00   Perfect prediction
0.25   Always predicting 0.5 (no information)
1.00   Worst possible (predicted 1.0, outcome was 0)

The Brier score is strictly proper — the optimal strategy is to report your honest probability estimate.

Missing predictions are imputed as 0.5 → scored at 0.25.

Leaderboard ranking is based on brier_72h — the 72-hour rolling average Brier score (ascending, lower is better).
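A minimal sketch of the scoring rule, including the 0.5 imputation for missing predictions. The function names and the simple mean below are illustrative; the leaderboard's actual 72-hour rolling window is computed by the platform:

```python
def brier(prediction: float, outcome: int) -> float:
    """Brier score for one binary event: (prediction - outcome)^2. Lower is better."""
    return (prediction - outcome) ** 2

def average_brier(predictions: dict, outcomes: dict) -> float:
    """Mean Brier score over resolved events; missing predictions are imputed as 0.5."""
    scores = [brier(predictions.get(eid, 0.5), y) for eid, y in outcomes.items()]
    return sum(scores) / len(scores)

outcomes = {"e1": 1, "e2": 0, "e3": 1}
predictions = {"e1": 0.9, "e2": 0.2}        # "e3" is missing -> imputed as 0.5
print(average_brier(predictions, outcomes))  # (0.01 + 0.04 + 0.25) / 3 ≈ 0.1
```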

Create Your Tracker

A tracker is a model that receives event data and returns probability forecasts. It operates incrementally: events are pushed via feed_update(), and predictions are requested via predict().

To participate, subclass TrackerBase and implement _predict():

from numinous.tracker import TrackerBase


class MyModel(TrackerBase):

    def _predict(self, subject):
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}

        event_id = data.get("event_id", subject)
        # Your logic here
        prediction = 0.5

        return {"event_id": event_id, "prediction": prediction}

How It Works

  1. feed_update(data) is called with new event data — stored automatically by TrackerBase
  2. predict(subject, ...) is called — use self._get_data(subject) to access the latest event data
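The two steps above can be exercised end to end. The `TrackerBase` below is a minimal stand-in written for this sketch (the real class lives in `numinous.tracker` and does more); it only shows how fed events become available to `_predict()`:

```python
# Minimal stand-in for TrackerBase, for illustration only.
class TrackerBase:
    def __init__(self):
        self._store = {}

    def feed_update(self, data):
        # Step 1: new event data is pushed and stored by event_id.
        self._store[data["event_id"]] = data

    def _get_data(self, subject):
        return self._store.get(subject)

    def predict(self, subject):
        # Step 2: a forecast is requested for a stored subject.
        return self._predict(subject)


class MyModel(TrackerBase):
    def _predict(self, subject):
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}
        return {"event_id": data.get("event_id", subject), "prediction": 0.5}


model = MyModel()
model.feed_update({"event_id": "numinous-12345", "title": "Will X happen by Y?"})
print(model.predict("numinous-12345"))
# {'event_id': 'numinous-12345', 'prediction': 0.5}
```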

Available Event Fields

Inside _predict(), self._get_data(subject) gives you:

Field        Type        Description
event_id     str         Unique event identifier
event_type   str         Market type, lowercased (e.g. "llm", "sports", "crypto")
title        str         The question being asked
description  str | None  Additional context and resolution criteria
cutoff       str | None  ISO 8601 resolution deadline
metadata     dict        Event metadata: market_type, topics, trigger_name, polymarket_market_id
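Those fields can be read straight off the event dict. The helper name and the topic-based nudge below are purely illustrative, not a recommended strategy:

```python
def predict_from_event(data: dict, subject: str) -> dict:
    """Build a forecast payload from one event record (illustrative heuristic)."""
    if not isinstance(data, dict):
        return {"event_id": subject, "prediction": 0.5}

    metadata = data.get("metadata") or {}
    topics = metadata.get("topics", [])

    # Purely illustrative: nudge by topic, otherwise stay at maximum uncertainty.
    prediction = 0.55 if "Finance" in topics else 0.5
    return {"event_id": data.get("event_id", subject), "prediction": prediction}


event = {
    "event_id": "numinous-12345",
    "event_type": "llm",
    "title": "Will X happen by Y?",
    "metadata": {"market_type": "LLM", "topics": ["Finance"]},
}
print(predict_from_event(event, "numinous-12345"))
# {'event_id': 'numinous-12345', 'prediction': 0.55}
```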

Example

See the quickstart notebook to get started.

Gateway

Your model has no direct internet access in production. All external calls (LLMs, search, OSINT...) must go through the gateway, a local proxy to multiple AI providers.

  • In production: SANDBOX_PROXY_URL is set automatically and points to the Crunch gateway — API costs are covered by Crunch.
  • Locally: you run the gateway yourself with your own API keys. Most providers offer a free tier.

Start the gateway locally

crunch-numinous gateway restart

API keys

API keys are only needed for local testing — do not include them in the notebook you submit.

You can set them in two ways:

Option 1 — Environment variables (e.g. in a notebook cell you won't submit):

import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."

Option 2 — A persistent env file that you never submit:

# ~/.crunch-numinous-gateway.env
OPENAI_API_KEY=sk-...
OPENROUTER_API_KEY=sk-or-...
CHUTES_API_KEY=...

You can also create it interactively:

crunch-numinous gateway configure

Use the gateway in your tracker

In your model, call the gateway via SANDBOX_PROXY_URL:

import os

import httpx

GATEWAY_URL = os.environ.get("SANDBOX_PROXY_URL", "http://localhost:8090")

resp = httpx.post(
    f"{GATEWAY_URL}/api/gateway/openai/responses",
    json={
        "model": "gpt-5-nano",
        "input": [{"role": "user", "content": "Will BTC hit 100k?"}],
    },
    timeout=30.0,
)
resp.raise_for_status()  # surface HTTP errors early
answer = resp.json()     # provider response, proxied through the gateway

See the API Reference for all available endpoints and providers.
