Numinous Crunch Starter Package

Numinous Crunch Challenge

A real-time binary event forecasting competition powered by Numinous (Bittensor Subnet 6) and hosted on CrunchDAO.

Numinous is a forecasting protocol that aggregates agents into superhuman LLM forecasters. In this competition, models predict the probability that real-world events — sourced from Polymarket — resolve "Yes". Predictions are scored using the Brier score, a strictly proper scoring rule that rewards calibrated, honest probabilities.

Install

pip install crunch-numinous

What You Must Predict

For each event, you receive structured data and must return a probability between 0.0 and 1.0 that the event resolves "Yes":

# Input: event data pushed to your model (aligned with Numinous Subnet 6 validator payload)
{
    "event_id": "numinous-12345",
    "event_type": "llm",
    "title": "Will X happen by Y?",
    "description": "...",            # optional
    "cutoff": "2026-03-16T00:00:00Z",  # optional, ISO 8601
    "metadata": {"market_type": "LLM", "topics": ["Finance"]}
}

# Output: your probability forecast
{"event_id": "numinous-12345", "prediction": 0.72, "reasoning": "Based on..."}
  • prediction = 1.0 → certain "Yes"
  • prediction = 0.0 → certain "No"
  • prediction = 0.5 → maximum uncertainty

Predictions are clipped to [0.01, 0.99] during scoring.
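To make the expected output shape concrete, here is a small helper (hypothetical — not part of crunch-numinous) that clamps a raw probability and builds the output dict:

```python
def finalize_forecast(event_id, prediction, reasoning=""):
    """Clamp a raw probability into [0.0, 1.0] and build the output dict.

    Hypothetical helper, shown only to illustrate the output format;
    the platform itself clips to [0.01, 0.99] during scoring.
    """
    p = float(prediction)
    p = min(max(p, 0.0), 1.0)
    return {"event_id": event_id, "prediction": p, "reasoning": reasoning}

print(finalize_forecast("numinous-12345", 1.3))  # prediction clamped to 1.0
```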

Scoring

Predictions are evaluated using the Brier score:

$$ \text{Brier} = (\text{prediction} - \text{outcome})^2 $$

Lower is better.

Score   Meaning
0.00    Perfect prediction
0.25    Always predicting 0.5 (no information)
1.00    Worst possible (e.g. predicted 1.0, outcome was "No")

The Brier score is strictly proper — the optimal strategy is to report your honest probability estimate.

Missing predictions are imputed as 0.5 → scored at 0.25.
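The scoring rules above can be sketched in a few lines (an illustration of the stated rules, not the platform's actual scoring code):

```python
def brier(prediction, outcome):
    """Brier score for a binary event: (p - y)^2, lower is better."""
    p = min(max(prediction, 0.01), 0.99)  # clipping applied during scoring
    return (p - outcome) ** 2

def score_event(prediction, outcome):
    """Score one event; outcome is 1 for "Yes", 0 for "No"."""
    if prediction is None:
        prediction = 0.5  # missing predictions are imputed as 0.5
    return brier(prediction, outcome)

print(score_event(None, 1))  # imputed 0.5 -> 0.25
```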

Leaderboard ranking is based on brier_72h — the 72-hour rolling average Brier score (ascending, lower is better).
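A rough sketch of how such a trailing-window average can be computed (illustrative only — the platform's exact windowing and tie-breaking may differ):

```python
from datetime import datetime, timedelta, timezone

def brier_72h(scored, now):
    """Mean Brier score over events resolved in the trailing 72 hours.

    `scored` is a list of (resolved_at: datetime, brier: float) pairs.
    Returns None when no events fall inside the window.
    """
    cutoff = now - timedelta(hours=72)
    window = [s for t, s in scored if t >= cutoff]
    return sum(window) / len(window) if window else None

now = datetime(2026, 3, 16, tzinfo=timezone.utc)
history = [
    (now - timedelta(hours=100), 0.40),  # outside the window, ignored
    (now - timedelta(hours=10), 0.10),
    (now - timedelta(hours=50), 0.20),
]
print(brier_72h(history, now))  # mean of the two in-window scores
```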

Create Your Tracker

A tracker is a model that receives event data and returns probability forecasts. It operates incrementally: events are pushed via feed_update(), and predictions are requested via predict().

To participate, subclass TrackerBase and implement _predict():

from numinous.tracker import TrackerBase


class MyModel(TrackerBase):

    def _predict(self, subject):
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}

        event_id = data.get("event_id", subject)
        # Your logic here
        prediction = 0.5

        return {"event_id": event_id, "prediction": prediction}

How It Works

  1. feed_update(data) is called with new event data — stored automatically by TrackerBase
  2. predict(subject, ...) is called — use self._get_data(subject) to access the latest event data
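The two-call flow can be illustrated with a toy stand-in (this is not the real TrackerBase — in `numinous.tracker`, storage is handled for you and subclasses only implement `_predict()`):

```python
class MiniTracker:
    """Toy stand-in for TrackerBase, showing the push/pull flow."""

    def __init__(self):
        self._store = {}

    def feed_update(self, data):
        # Step 1: new event data is pushed and stored by event_id.
        self._store[data["event_id"]] = data

    def _get_data(self, subject):
        return self._store.get(subject)

    def predict(self, subject):
        # Step 2: a prediction is requested for a previously stored event.
        data = self._get_data(subject)
        if not isinstance(data, dict):
            return {"event_id": subject, "prediction": 0.5}
        return {"event_id": data["event_id"], "prediction": 0.5}

tracker = MiniTracker()
tracker.feed_update({"event_id": "numinous-12345", "title": "Will X happen by Y?"})
print(tracker.predict("numinous-12345"))
```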

Available Event Fields

Inside _predict(), self._get_data(subject) gives you:

Field        Type        Description
event_id     str         Unique event identifier
event_type   str         Market type, lowercased (e.g. "llm", "sports", "crypto")
title        str         The question being asked
description  str | None  Additional context and resolution criteria
cutoff       str | None  ISO 8601 resolution deadline
metadata     dict        Event metadata: market_type, topics, trigger_name, polymarket_market_id
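When present, the cutoff field can be parsed with the standard library. A hypothetical helper (not part of the package) that turns it into hours remaining — note that `datetime.fromisoformat` only accepts a trailing "Z" from Python 3.11 on, so earlier versions need the `replace` shown:

```python
from datetime import datetime, timezone

def hours_until_cutoff(cutoff, now=None):
    """Hours remaining before an event's resolution deadline, or None."""
    if cutoff is None:
        return None
    # Python < 3.11: fromisoformat does not accept a trailing "Z".
    dt = datetime.fromisoformat(cutoff.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (dt - now).total_seconds() / 3600

now = datetime(2026, 3, 15, tzinfo=timezone.utc)
print(hours_until_cutoff("2026-03-16T00:00:00Z", now))  # 24.0
```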

Example

See the quickstart notebook to get started.

Gateway

Your model has no direct internet access in production. All external calls (LLMs, search, OSINT...) must go through the gateway, a local proxy to multiple AI providers.

  • In production: SANDBOX_PROXY_URL is set automatically and points to the Crunch gateway — API costs are covered by Crunch.
  • Locally: you run the gateway yourself with your own API keys. Most providers offer a free tier.

Start the gateway locally

crunch-numinous gateway restart

API keys

API keys are only needed for local testing — do not include them in the notebook you submit.

You can set them in two ways:

Option 1 — Environment variables (e.g. in a notebook cell you won't submit):

import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."

Option 2 — A persistent env file that you never submit:

# ~/.crunch-numinous-gateway.env
OPENAI_API_KEY=sk-...
OPENROUTER_API_KEY=sk-or-...
CHUTES_API_KEY=...

You can also create it interactively:

crunch-numinous gateway configure

Use the gateway in your tracker

In your model, call the gateway via SANDBOX_PROXY_URL:

import os, httpx

GATEWAY_URL = os.environ.get("SANDBOX_PROXY_URL", "http://localhost:8090")

resp = httpx.post(
    f"{GATEWAY_URL}/api/gateway/openai/responses",
    json={
        "model": "gpt-5-nano",
        "input": [{"role": "user", "content": "Will BTC hit 100k?"}],
    },
    timeout=30.0,
)
resp.raise_for_status()  # surface gateway/provider errors early

See the API Reference for all available endpoints and providers.
