
AI-powered autonomous novel generation system — install with 'pip install novelscribe', run with 'scribe'

Project description

NovelScribe


An AI-powered autonomous novel generation agent. Give it a concept, pick a genre, and it writes a full novel — outline, characters, world-building, chapters, and all.

Language / 语言 / 語言: English | 简体中文 | 繁體中文


Install

pipx install novelscribe

That's it. The scribe command is now available.

Why pipx? On macOS and many Linux distros, pip install is blocked by the system Python (externally-managed-environment error). pipx installs into an isolated virtualenv, avoiding this entirely. Install pipx first if you don't have it: brew install pipx && pipx ensurepath (macOS) or sudo apt install pipx && pipx ensurepath (Linux).

Other install methods
pip install --user novelscribe    # user-level (may warn on macOS)

Or the one-liner script (auto-detects pipx/pip):

curl -sSL https://raw.githubusercontent.com/huangjien/wst/main/novel_harness/install.sh | bash

30-Second Demo

# Create a new novel project
scribe new my-novel

# Run fully autonomous generation (outline → characters → world → chapters)
scribe autonomous start my-novel

# See all available commands
scribe --help

What It Does

NovelScribe chains 9 specialized AI agents into an autonomous pipeline:

discuss → genre → characters → world-building → outline → chapter-planning → writing → continuity → editing

Each agent is a Markdown file with a system prompt and model config. The orchestrator runs them step-by-step through YAML-defined workflows, with verification loops that auto-fix issues until quality passes.
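The verification-loop pattern can be sketched in a few lines of Python. This is a minimal illustration of the general idea, not NovelScribe's actual internals; `run_step`, `verify`, and `MAX_ATTEMPTS` are invented names. A step runs, a verifier checks its output, and the verifier's feedback is fed back into the next attempt until the check passes or attempts run out:

```python
# Sketch of a verification loop: run a step, check its output, and
# retry with the verifier's feedback until quality passes.
MAX_ATTEMPTS = 3

def run_with_verification(run_step, verify, max_attempts=MAX_ATTEMPTS):
    feedback = None
    for attempt in range(1, max_attempts + 1):
        # The step receives feedback from the previous failed attempt.
        output = run_step(feedback)
        ok, feedback = verify(output)
        if ok:
            return output, attempt
    raise RuntimeError(f"quality gate failed after {max_attempts} attempts")
```

Capping attempts matters: without a limit, an agent that never satisfies the verifier would loop (and bill) forever.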

Key Features

  • Autonomous generation — one command produces a complete novel
  • 9 specialized agents — writer, editor, character designer, world-builder, continuity checker, etc.
  • Verification loops — auto-detects and fixes plot holes, inconsistencies, and quality issues
  • Quality gates — Bronze / Silver / Gold / Platinum tiers with metrics and benchmarks
  • Parallel execution — runs independent agents concurrently for 2-4x speedup
  • LLM streaming — real-time token streaming on all 5 providers
  • Circuit breakers — automatic failover between LLM providers on persistent failures
  • Schema versioning — forward-compatible agent/workflow definitions with migration support
  • Run logging — structured JSONL logs with scribe logs run <project> command
  • Scales to 3M words — hierarchical summaries, memory optimization, chapter pagination, phase-level caching
  • Multi-language — English, 简体中文, 繁體中文 (with fallback)
  • 5 LLM providers — OpenAI, Claude, Gemini, Ollama, LM Studio
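To make the circuit-breaker feature concrete, here is a minimal sketch of the general pattern (the class and function names are hypothetical, not NovelScribe's API): each provider gets a breaker that trips after a run of consecutive failures, and requests fall through to the next healthy provider.

```python
# Sketch of per-provider circuit breaking with automatic fallback.
class CircuitBreaker:
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = 0  # consecutive failure count

    @property
    def open(self):
        # An open breaker means the provider is skipped entirely.
        return self.failures >= self.threshold

    def record(self, success):
        self.failures = 0 if success else self.failures + 1

def call_with_fallback(providers, breakers, request):
    """Try each (name, call) pair in order, skipping tripped breakers."""
    for name, call in providers:
        breaker = breakers[name]
        if breaker.open:
            continue
        try:
            result = call(request)
            breaker.record(True)
            return name, result
        except Exception:
            breaker.record(False)
    raise RuntimeError("all providers unavailable")
```

The point of the breaker is that after repeated failures a provider is skipped without waiting on its timeout, so fallback becomes cheap instead of adding latency to every call.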

Usage

Create a Novel

scribe new my-novel --provider openai --model gpt-4

Run Autonomous Generation

scribe autonomous start my-novel

Run Individual Phases

# Generate the outline
scribe run workflow outline --project my-novel

# Run a single agent
scribe run agent writer_agent --project my-novel

# Preview without executing
scribe run workflow writing --project my-novel --dry-run

Explore Agents & Workflows

scribe agents list
scribe agents show writer_agent

scribe workflows list
scribe workflows show outline

Templates

scribe templates list
scribe templates show heros_journey --category plot
scribe template-create plot -n "My Structure" -d "Description" -b "0:Hook:Description"

Version & Updates

scribe version info              # current version
scribe version check             # check PyPI for updates
scribe version update            # self-update to latest
scribe version update --yes      # skip confirmation

Run Logs

scribe logs run <project>        # view structured run logs
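Because the logs are JSONL (one JSON object per line), they are easy to post-process yourself. A minimal sketch, assuming each event carries a `status` field; the real log schema may differ, so check an actual log file first:

```python
# Sketch: tally event statuses from JSONL run-log lines.
import json

def summarize_run(lines):
    counts = {}
    for line in lines:
        event = json.loads(line)
        status = event.get("status", "unknown")
        counts[status] = counts.get(status, 0) + 1
    return counts
```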

NovelScribe checks for updates on startup (once per 24h) and shows a notification when a new version is available.

Multi-Language Support

Language             Code
English              en
Simplified Chinese   zh-CN
Traditional Chinese  zh-TW

Set per-project in META.yaml:

project_name: my_novel
language: zh-CN
genre: fantasy
target_word_count: 80000

Or per-command:

scribe --lang zh-TW agents list

Language resolution priority: --lang flag > META.yaml > user config > SCRIBE_LANGUAGE env > system locale > English fallback.
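That priority chain amounts to a first-non-empty lookup. A sketch in Python with illustrative names (`resolve_language` is not part of the scribe CLI):

```python
# Sketch of the documented resolution order: CLI flag, then META.yaml,
# then user config, then SCRIBE_LANGUAGE, then system locale, then "en".
import locale
import os

def resolve_language(cli_flag=None, meta=None, user_config=None):
    for candidate in (
        cli_flag,
        (meta or {}).get("language"),
        (user_config or {}).get("language"),
        os.environ.get("SCRIBE_LANGUAGE"),
        locale.getlocale()[0],
    ):
        if candidate:
            return candidate
    return "en"  # English fallback
```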

Configuration

Set your LLM provider:

# Interactive setup
scribe config setup

# Or set directly
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...

Supported providers:

Provider       Env Var            Notes
OpenAI         OPENAI_API_KEY     Best quality
Anthropic      ANTHROPIC_API_KEY  Long context
Google Gemini  GOOGLE_API_KEY     Free tier available
Ollama         (none)             Local, no API key needed
LM Studio      (none)             Local, no API key needed

Architecture

novel_harness/
├── cli.py                 # Entry point (registered as 'scribe')
├── cli_*.py               # Command modules (agents, workflows, run, autonomous, logs, etc.)
├── core/                  # Core engine
│   ├── orchestrator.py    # Central coordinator
│   ├── llm_client.py      # Multi-provider LLM abstraction (with streaming)
│   ├── circuit_breaker.py # Per-provider circuit breakers with auto-fallback
│   ├── optimizing_llm_client.py # LLM call wrapper with cache + cost tracking
│   ├── model_registry.py  # Per-model token limit registry
│   ├── schema_versions.py # Schema versioning for agent/workflow definitions
│   ├── run_logger.py      # Structured JSONL run logging
│   ├── phase_context_cache.py # Per-phase artifact pre-loading
│   ├── storage.py         # File-based storage
│   ├── version.py         # Update checker & self-updater
│   ├── performance_profiler.py        # Profiling facade
│   ├── performance_profiler_core.py   # Profiling implementation
│   ├── performance_profiler_types.py  # Profiling data types
│   └── ...
├── agents/                # Agent definitions (Markdown, schema_version: 1)
├── workflows/             # Workflow definitions (YAML, schema_version: 1)
├── api/                   # FastAPI server with WebSocket streaming (optional)
└── tests/                 # Test suite (5187 tests, 96% coverage)

LLM Providers

NovelScribe supports any OpenAI-compatible API. Configure via environment variables or scribe config setup:

# .env file
OPENAI_API_KEY=sk-...
LLM_PROVIDER=openai
LLM_MODEL=gpt-4-turbo-preview

API Server (Optional)

For web UI integration or programmatic access:

pip install "novelscribe[api]"
uvicorn api.app:app --host 0.0.0.0 --port 8000

Development

git clone https://github.com/huangjien/wst.git
cd wst/novel_harness
uv sync --all-extras

# Run quality checks
make check

# Individual
make format      # ruff format
make lint        # ruff check --fix
make type-check  # mypy
make test        # pytest with coverage
make build       # uv build → dist/

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

novelscribe-0.2.26.tar.gz (748.4 kB)


Built Distribution


novelscribe-0.2.26-py3-none-any.whl (426.7 kB)


File details

Details for the file novelscribe-0.2.26.tar.gz.

File metadata

  • Download URL: novelscribe-0.2.26.tar.gz
  • Upload date:
  • Size: 748.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for novelscribe-0.2.26.tar.gz
Algorithm Hash digest
SHA256 d3bdd484a98c6d4da478b9e811c475b1c5cde301d3c8b6b4bf765d3e2c6d7481
MD5 22b74b4337f602469fcda6f82aef341f
BLAKE2b-256 e50b42e237f1e7021f4e1194f1d6cc17a98cb38a2aa3304eb8e07cb2aa250dea


Provenance

The following attestation bundles were made for novelscribe-0.2.26.tar.gz:

Publisher: cd.yml on huangjien/wst

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file novelscribe-0.2.26-py3-none-any.whl.

File metadata

  • Download URL: novelscribe-0.2.26-py3-none-any.whl
  • Upload date:
  • Size: 426.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for novelscribe-0.2.26-py3-none-any.whl
Algorithm Hash digest
SHA256 9e09c9c35393ce706908391c4f0ba8cb72a6e0676bd3d657a19679373e6c10e0
MD5 7e1e455530c3fa41413d879c9f252637
BLAKE2b-256 75650cc8273b17d3788c7e6a0483dfadf81f6df431dd547d576cf77387b1f250


Provenance

The following attestation bundles were made for novelscribe-0.2.26-py3-none-any.whl:

Publisher: cd.yml on huangjien/wst

