

NovelScribe


An AI-powered autonomous novel generation agent. Give it a concept, pick a genre, and it writes a full novel — outline, characters, world-building, chapters, and all.

Language / 语言 / 語言: English | 简体中文 | 繁體中文


Install

pipx install novelscribe

That's it. The scribe command is now available.

Why pipx? On macOS and many Linux distros, pip install is blocked by the system Python (externally-managed-environment error). pipx installs into an isolated virtualenv, avoiding this entirely. Install pipx first if you don't have it: brew install pipx && pipx ensurepath (macOS) or sudo apt install pipx && pipx ensurepath (Linux).

Other install methods
pip install --user novelscribe    # user-level (may warn on macOS)

Or the one-liner script (auto-detects pipx/pip):

curl -sSL https://raw.githubusercontent.com/huangjien/wst/main/novel_harness/install.sh | bash

30-Second Demo

# Create a new novel project
scribe new my-novel

# Run fully autonomous generation (outline → characters → world → chapters)
scribe autonomous start my-novel

# See available commands and help
scribe help

What It Does

NovelScribe chains 9 specialized AI agents into an autonomous pipeline:

discuss → genre → characters → world-building → outline → chapter-planning → writing → continuity → editing

Each agent is a Markdown file with a system prompt and model config. The orchestrator runs them step-by-step through YAML-defined workflows, with verification loops that auto-fix issues until quality passes.
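As a rough illustration of the shape such a workflow might take (the field names below are hypothetical, not NovelScribe's actual schema):

```yaml
# Hypothetical workflow definition -- field names are illustrative only,
# not NovelScribe's real YAML schema.
schema_version: 1
name: outline
steps:
  - agent: outline_agent       # Markdown file holding the system prompt + model config
    retries: 2
  - agent: continuity_agent
    verify: true               # loop until the quality check passes
```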

Key Features

  • Autonomous generation — one command produces a complete novel
  • 9 specialized agents — writer, editor, character designer, world-builder, continuity checker, etc.
  • Verification loops — auto-detects and fixes plot holes, inconsistencies, and quality issues
  • Hook lifecycle framework (v2.2) — YAML-declared hooks across workflow/runtime with required vs optional semantics, strict guardrails for builtin.http / builtin.shell, and fail-closed external plugin loading
  • Resilience + orchestration upgrades (v2.3) — step retry/timeout, hook when conditions, runtime transient retry, DAG parallel execution with rollback, exponential+jitter backoff, checkpoint pruning, and trigger API
  • Reliability hardening (v2.4) — durable trigger store (journal+snapshot), HMAC webhook security, async dispatch with retry/DLQ semantics, strict-mode runtime gates, event-driven checkpoint decisions, docs consistency guard
  • Governance & schema maturity (v2.5) — secrets management with ${secrets.*} resolution, reusable step templates, workflow-level timeout, output schema validation, versioned artifact management, human-in-loop default, language-consistent output (en/zh-CN/zh-TW)
  • Quality gates — Bronze / Silver / Gold / Platinum tiers with metrics and benchmarks
  • Parallel execution — runs independent agents concurrently for 2-4x speedup
  • LLM streaming — real-time token streaming on all 8 providers
  • Circuit breakers — automatic failover between LLM providers on persistent failures
  • Schema versioning — forward-compatible agent/workflow definitions with migration support
  • Run logging — structured JSONL logs with scribe logs run <project> command
  • Scales to 3M words — hierarchical summaries, memory optimization, chapter pagination, phase-level caching
  • Multi-language — English, 简体中文, 繁體中文 (with fallback)
  • 8 LLM providers — OpenAI, Claude, Gemini, Ollama, LM Studio, MiniMax, ZhipuAI (z.ai), Moonshot (Kimi)
  • AI Chat assistant — scribe chat <question> for instant answers about novel writing and scribe usage
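The circuit-breaker idea above is the standard pattern: after repeated failures a provider is temporarily skipped and calls fall through to the next one. A minimal sketch of that pattern (illustrative only; NovelScribe's real implementation lives in core/circuit_breaker.py and differs in detail):

```python
# Minimal circuit-breaker failover sketch between LLM providers.
# Illustrative only -- not NovelScribe's actual implementation.

class CircuitBreaker:
    """Opens a provider's circuit after `threshold` consecutive failures."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = {}  # provider name -> consecutive failure count

    def available(self, provider):
        return self.failures.get(provider, 0) < self.threshold

    def record(self, provider, ok):
        self.failures[provider] = 0 if ok else self.failures.get(provider, 0) + 1


def call_with_failover(breaker, providers, call):
    """Try providers in order, skipping any whose circuit is open."""
    for provider in providers:
        if not breaker.available(provider):
            continue
        try:
            result = call(provider)
            breaker.record(provider, ok=True)
            return result
        except RuntimeError:
            breaker.record(provider, ok=False)
    raise RuntimeError("all providers failed or unavailable")
```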

Usage

Create a Novel

scribe new my-novel --provider openai --model gpt-4

Run Autonomous Generation

scribe autonomous start my-novel

Run Individual Phases

# Generate the outline
scribe run workflow outline --project my-novel

# Run a single agent
scribe run agent writer_agent --project my-novel

# Preview without executing
scribe run workflow writing --project my-novel --dry-run

Explore Agents & Workflows

scribe agents list
scribe agents show writer_agent

scribe workflows list
scribe workflows show outline

Templates

scribe templates list                          # List all templates + KB stats
scribe templates list character                # List KB character templates
scribe templates list character "male, brave"  # Search KB character templates
scribe templates show heros_journey --category plot
scribe templates kb search "hero journey"      # Search template knowledge base
scribe templates kb stats                      # Show KB statistics
scribe templates create plot -n "My Structure" -d "Description" -b "0:Hook:Description"

Semantic search: Install with uv sync --extra search for embedding-based KB retrieval.
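The -b argument above appears to encode a beat as position:title:description. The CLI's actual parsing rules are not documented here, but a parser for that assumed shape could look like:

```python
# Hypothetical parser for the "position:title:description" beat string
# passed via `scribe templates create ... -b "0:Hook:Description"`.
# The real CLI's format rules may differ; this is an assumption.

def parse_beat(raw: str) -> dict:
    """Split a beat string into its three assumed fields."""
    position, title, description = raw.split(":", 2)
    return {"position": int(position), "title": title, "description": description}
```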

Setup & Configuration

scribe setup interactive                      # Interactive setup wizard
scribe setup apikey openai sk-...             # Configure API key
scribe setup apikey ollama                    # Configure local provider (no key needed)
scribe setup apikey openai sk-... --base-url  # Custom base URL (proxy support)
scribe setup language zh-CN                   # Set default language
scribe setup install-skill                    # Install IDE chatbot skill
scribe setup verify                           # Verify installation

Checkpoint Management (Human-in-the-Loop)

scribe checkpoint list my-novel               # List checkpoints
scribe checkpoint show my-novel               # Show current proposal
scribe checkpoint approve my-novel            # Approve checkpoint
scribe checkpoint modify my-novel             # Modify and approve
scribe checkpoint reject my-novel             # Reject checkpoint
scribe checkpoint history my-novel            # Show decision history
scribe checkpoint compare my-novel            # Compare all alternatives

Proactive Suggestions

scribe proactive enable                       # Enable proactive suggestions
scribe proactive disable                      # Disable proactive suggestions
scribe proactive status                       # Show configuration
scribe proactive list                         # List available suggestions
scribe proactive history                      # Show acceptance history
scribe proactive review                       # Review pending suggestions
scribe proactive promote <id>                 # Promote suggestion to global KB

Project Management

scribe project list                           # List all novel projects

Version & Updates

scribe version info              # current version
scribe version check             # check PyPI for updates
scribe version update            # self-update to latest
scribe version update --yes      # skip confirmation

AI Chat Assistant

Ask questions about scribe usage or novel writing craft — no quotes needed:

scribe chat how to create a new character?
scribe chat what agents are available?
scribe chat --provider anthropic how to build a plot?

# Interactive multi-turn conversation
scribe chat

The AI reads all agent definitions and project documentation to give contextual answers. Supports English, 简体中文, and 繁體中文.

Run Logs

scribe logs run <project>        # view structured run logs

Character Archetypes

scribe character archetypes list              # list all built-in archetypes
scribe character archetypes show hero         # view archetype details
scribe character convert hero "John Doe"      # convert archetype to character
scribe character convert rebel "Jane" --project my-novel  # add to project

Export

scribe export manuscript my-novel --format pdf     # export as PDF
scribe export manuscript my-novel --format epub    # export as EPUB
scribe export manuscript my-novel --format docx    # export as DOCX
scribe export manuscript my-novel --format md      # export as Markdown
scribe export manuscript my-novel --format pdf --output-dir ./out/  # custom output directory

Review

scribe review start my-novel                  # run review pipeline
scribe review report my-novel                 # generate review report
scribe review formats                         # list report export formats

Quality Assurance

scribe qa analyze my-novel                    # quality metrics analysis
scribe qa validate my-novel                   # run validation rules
scribe qa benchmark my-novel                  # benchmark against standards
scribe qa gate-check my-novel --chapter 1 --level gold  # quality gate check

Parallel Execution

scribe parallel plan my-novel                 # plan parallel execution
scribe parallel execute my-novel              # execute with concurrency
scribe parallel benchmark                     # benchmark parallel speedup
scribe parallel stats                         # show execution statistics
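DAG-based parallel planning generally means grouping steps into "waves": everything whose prerequisites are done can run concurrently. A minimal sketch of that grouping (not NovelScribe's actual scheduler):

```python
# Group a dependency DAG into waves of steps that can run concurrently.
# Illustrative only -- not NovelScribe's real parallel planner.

def parallel_waves(deps):
    """deps maps step -> set of prerequisite steps; returns a list of waves."""
    remaining = {step: set(pre) for step, pre in deps.items()}
    done, waves = set(), []
    while remaining:
        ready = [s for s, pre in remaining.items() if pre <= done]
        if not ready:
            raise ValueError("dependency cycle detected")
        waves.append(sorted(ready))       # steps in one wave are independent
        done.update(ready)
        for s in ready:
            del remaining[s]
    return waves
```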

NovelScribe checks for updates on startup (once per 24h) and shows a notification when a new version is available.
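A once-per-24h throttle like this is typically done by persisting the timestamp of the last check; a minimal sketch (the state-file name and JSON layout are assumptions, not NovelScribe's actual cache):

```python
# Sketch of a 24-hour update-check throttle. The cache path and format
# here are hypothetical; NovelScribe's actual state file may differ.
import json
import time
from pathlib import Path

CHECK_INTERVAL = 24 * 60 * 60  # seconds


def should_check_for_updates(state_file: Path, now=None) -> bool:
    """Return True if no check has been recorded in the last 24 hours."""
    now = time.time() if now is None else now
    try:
        last = json.loads(state_file.read_text())["last_check"]
    except (FileNotFoundError, KeyError, ValueError):
        return True  # no record yet, or unreadable: check now
    return now - last >= CHECK_INTERVAL


def record_check(state_file: Path, now=None) -> None:
    now = time.time() if now is None else now
    state_file.write_text(json.dumps({"last_check": now}))
```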

Multi-Language Support

Language               Code
English                en
Simplified Chinese     zh-CN
Traditional Chinese    zh-TW

Set per-project in META.yaml:

project_name: my_novel
language: zh-CN
genre: fantasy
target_word_count: 80000

Or per-command:

scribe --lang zh-TW agents list

Language resolution priority: --lang flag > META.yaml > user config > SCRIBE_LANGUAGE env > system locale > English fallback.
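That priority chain amounts to "first non-empty value wins"; a minimal sketch of the resolution order (function and parameter names are illustrative, not NovelScribe's internals):

```python
# Illustrative resolution of the documented priority chain:
# --lang flag > META.yaml > user config > SCRIBE_LANGUAGE env > locale > "en".
import locale
import os


def resolve_language(cli_lang=None, meta_lang=None, config_lang=None) -> str:
    """Return the first language set in priority order, else English."""
    system_lang = (locale.getlocale()[0] or "").replace("_", "-") or None
    for candidate in (
        cli_lang,                           # --lang flag
        meta_lang,                          # project META.yaml
        config_lang,                        # user config
        os.environ.get("SCRIBE_LANGUAGE"),  # environment variable
        system_lang,                        # system locale
    ):
        if candidate:
            return candidate
    return "en"                             # final fallback
```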

Configuration

Set your LLM provider:

# View current configuration
scribe config get

# Or set directly
scribe config set language zh-CN
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...

Supported providers:

Provider          Env Var            Notes
OpenAI            OPENAI_API_KEY     Best quality
Anthropic         ANTHROPIC_API_KEY  Long context
Google Gemini     GOOGLE_API_KEY     Free tier available
Ollama            (none)             Local, no API key needed
LM Studio         (none)             Local, no API key needed
MiniMax           MINIMAX_API_KEY    OpenAI-compatible
ZhipuAI (z.ai)    ZHIPUAI_API_KEY    GLM-4 models
Moonshot (Kimi)   MOONSHOT_API_KEY   128K context

Architecture

novel_harness/
├── cli.py                 # Entry point (registered as 'scribe')
├── cli_*.py               # Command modules (agents, workflows, run, autonomous, logs, memory, etc.)
├── core/                  # Core engine
│   ├── orchestrator.py    # Central coordinator
│   ├── llm_client.py      # Multi-provider LLM abstraction (with streaming)
│   ├── circuit_breaker.py # Per-provider circuit breakers with auto-fallback
│   ├── optimizing_llm_client.py # LLM call wrapper with cache + cost tracking
│   ├── model_registry.py  # Per-model token limit registry
│   ├── schema_versions.py # Schema versioning for agent/workflow definitions
│   ├── run_logger.py      # Structured JSONL run logging
│   ├── phase_context_cache.py # Per-phase artifact pre-loading
│   ├── memory/            # Memory system
│   │   ├── conversation_memory.py  # Session journal (JSONL)
│   │   ├── knowledge_graph.py      # Entity relationship graph
│   │   ├── generation_learning.py   # Adaptive learning
│   │   ├── consistency_checker.py   # Graph consistency validation
│   │   ├── novel_wiki.py          # Obsidian-compatible markdown wiki
│   │   └── entity_extractor.py    # Structured-first entity extraction
│   ├── memory_bus.py      # Memory system facade (unified index/retrieve API)
│   ├── context_bus.py     # Context injection with optional memory integration
│   ├── hooks/             # Shared hook event bus + built-in handlers + plugin manager
│   ├── trigger_store.py   # Append-only journal + snapshot trigger persistence
│   ├── trigger_manager.py # Trigger registration, webhook security, async dispatch
│   ├── quality_gate.py    # Quality gate evaluation (Bronze/Silver/Gold/Platinum)
│   ├── quality_report.py  # Quality report generation
│   ├── template_kb.py     # Global template knowledge base with semantic search
│   ├── storage.py         # File-based storage
│   ├── version.py         # Update checker & self-updater
│   ├── performance_profiler.py        # Profiling facade
│   ├── performance_profiler_core.py   # Profiling implementation
│   ├── performance_profiler_types.py  # Profiling data types
│   ├── archetype_loader.py    # Archetype YAML parser and formatter
│   ├── archetype_converter.py # Archetype-to-character conversion engine
│   └── ...
├── agents/                # Agent definitions (Markdown, schema_version: 1)
├── workflows/             # Workflow definitions (YAML, schema_version: 1)
├── api/                   # FastAPI server with WebSocket streaming (optional)
└── tests/                 # Test suite (2649 tests, 100 files, 98% coverage)

LLM Providers

NovelScribe supports any OpenAI-compatible API. Configure via environment variables or scribe config set:

# .env file
OPENAI_API_KEY=sk-...
LLM_PROVIDER=openai
LLM_MODEL=gpt-4-turbo-preview

Memory System

NovelScribe includes a multi-layered memory system that tracks knowledge graph entities, conversation sessions, and learned patterns:

# View memory system status for a project
scribe memory kb status my-novel

# List knowledge wiki pages
scribe memory kb wiki-list my-novel

# View a wiki page
scribe memory kb wiki-view my-novel character

# Run knowledge base consistency check
scribe memory kb consistency-check my-novel

# Visualize entity relationships as ASCII graph
scribe memory kb graph-visualize my-novel

# View session history
scribe memory kb sessions my-novel

# View learning summaries
scribe memory kb learning my-novel

Components:

  • Knowledge Graph — entity relationships (characters, locations, events) with consistency checking
  • Knowledge Wiki — structured wiki pages with auto-generated content
  • Conversation Memory — append-only JSONL session journal for generation sessions
  • Generation Learning — adaptive learning from generation patterns
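As a rough illustration of the knowledge-graph component (the data model below is a toy, not NovelScribe's on-disk format), entities and relations can be stored as triples and scanned for contradictory pairs:

```python
# Toy knowledge graph with a consistency check over contradictory relations.
# Purely illustrative; NovelScribe's knowledge_graph.py is more elaborate.

class KnowledgeGraph:
    # Relation pairs treated as mutually exclusive for the same entity pair.
    CONTRADICTIONS = {("alive", "dead"), ("ally_of", "enemy_of")}

    def __init__(self):
        self.triples = set()  # (subject, relation, object)

    def add(self, subject, relation, obj):
        self.triples.add((subject, relation, obj))

    def inconsistencies(self):
        """Return (subject, relation-pair, object) entries that conflict."""
        found = []
        for s, rel, o in self.triples:
            for r_a, r_b in self.CONTRADICTIONS:
                other = r_b if rel == r_a else r_a if rel == r_b else None
                if other and (s, other, o) in self.triples:
                    entry = (s, tuple(sorted([rel, other])), o)
                    if entry not in found:
                        found.append(entry)
        return found
```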

API Server (Optional)

For web UI integration or programmatic access:

# If installed with pipx
~/.local/pipx/venvs/novelscribe/bin/uvicorn api.app:app --host 0.0.0.0 --port 8000

# If running from source
cd wst/novel_harness
uv run uvicorn api.app:app --host 0.0.0.0 --port 8000

Development

git clone https://github.com/huangjien/wst.git
cd wst
uv sync --all-extras

# Run quality checks
make check

# Individual
make format      # ruff format
make lint        # ruff check --fix
make type-check  # mypy
make test        # pytest with coverage
make build       # uv build → dist/

License

MIT
