
Newbro CLI for the Synapse communication-brain / execution-brain runtime prototype


Synapse

Backend-first prototype for a communication-brain / execution-brain runtime.

Concept

  • Communication Brain: handles acknowledgement, clarification, and user-facing status.
  • Execution Brain: owns task lifecycle and executor orchestration.
  • Shared Blackboard: the session-level state synchronization layer.
  • Protocols: explicit schemas for messages, tasks, execution events, and stream events.
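To illustrate the protocol idea, a stream-event schema might look like the following. The names and fields here are hypothetical, chosen only to show the shape of an explicit schema; they are not the actual Synapse protocol definitions:

```python
from dataclasses import dataclass, field
from typing import Literal

@dataclass
class StreamEvent:
    """Hypothetical sketch of a stream-event schema (not the real Synapse one)."""
    session_id: str
    kind: Literal["message", "task", "execution"]
    payload: dict = field(default_factory=dict)
```

Explicit schemas like this let the communication brain and execution brain validate what crosses the blackboard instead of passing loose dictionaries around.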

CLI

Synapse requires Python 3.12 or newer.

For a fresh clone, use the repo bootstrap launcher:

./install.sh
./newbro setup
./newbro connector setup
./newbro doctor
./newbro dev

./install.sh performs the initial bootstrap:

  • installs supported local development dependencies
  • creates .venv and installs the project in editable mode
  • installs frontend dependencies
  • writes starter ~/.newbro/.env and ~/.newbro/config.yaml files when they do not already exist

./newbro setup fills in ~/.newbro/.env and the shared ~/.newbro/config.yaml (runtime, API, and connector settings). By default it prompts for required runtime values such as OPENAI_API_KEY, and it can also enter the connector-host setup flow. For connector-only reconfiguration, use:

./newbro connector setup

For automation, use:

OPENAI_API_KEY=... ./newbro setup --non-interactive

If you already have legacy Synapse config under ~/.synapse and ~/.newbro does not exist yet, the CLI migrates that home directory to ~/.newbro on the first run.
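The one-time migration amounts to moving the legacy directory when, and only when, the new one does not yet exist. A simplified sketch (the real CLI may handle more edge cases):

```python
import shutil
from pathlib import Path

def migrate_home(old: Path, new: Path) -> bool:
    """Move a legacy config dir (e.g. ~/.synapse) to the new location
    (e.g. ~/.newbro) on first run. Returns True if a migration happened."""
    if old.is_dir() and not new.exists():
        shutil.move(str(old), str(new))
        return True
    return False
```

Because the move only fires when the target is absent, an existing ~/.newbro is never overwritten.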

Install From PyPI

Install the public package with:

python3 -m pip install newbro-cli
newbro --help
newbro executor setup
newbro executor run --base-url https://synapse.example.com --node-id node-1234 --token secret

The published package name is newbro-cli and the installed console script is newbro, but the Python module namespace is still synapse in this release. Use import synapse for Python imports; import newbro is not supported.
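You can confirm which namespace is importable in your environment with a small check (`importable` is a helper defined here for illustration, not part of the package):

```python
import importlib.util

def importable(name: str) -> bool:
    """Return True if a top-level module `name` can be imported here."""
    return importlib.util.find_spec(name) is not None

# After `pip install newbro-cli` you should see:
#   importable("synapse") -> True
#   importable("newbro")  -> False
```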

~/.newbro/.env is auto-loaded by the backend at startup. You do not need to export variables manually. OpenAI is required for normal development and demo runtime, so set OPENAI_API_KEY in ~/.newbro/.env before starting the app.
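The auto-loading behaves like a simple KEY=value parser over the file. A minimal sketch of that behavior (not the actual loader, which may additionally support quoting or comments at end of line):

```python
from pathlib import Path

def load_env_file(path: Path) -> dict[str, str]:
    """Parse KEY=value lines from an .env-style file, skipping blanks
    and comment lines. Returns {} if the file does not exist."""
    values: dict[str, str] = {}
    if not path.exists():
        return values
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values
```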

Optional ACPX Executor

If you want Synapse to delegate execution through acpx instead of the direct Codex executor, install ACPX first:

npm install -g acpx@latest

Quick verification:

acpx --version
codex --version

Then add at least this to ~/.newbro/.env:

SYNAPSE_ACPX_EXECUTOR_ENABLED=true

Optional overrides:

# SYNAPSE_ACPX_COMMAND=acpx
# SYNAPSE_ACPX_AGENT=codex
# SYNAPSE_ACPX_PERMISSION_MODE=approve-all
# SYNAPSE_ACPX_NON_INTERACTIVE_PERMISSIONS=deny
# SYNAPSE_ACPX_TIMEOUT_SECONDS=300

If both ACPX and the direct Codex executor are enabled, Synapse prefers ACPX.

Common Commands

./install.sh
./newbro setup
./newbro connector setup
./newbro doctor
./newbro dev
./newbro backend
./newbro frontend
./newbro start
./newbro connector run
./newbro service install
./newbro service start
./newbro service stop
./newbro service restart

The installed console script is named newbro, so after setup you can run .venv/bin/newbro dev or activate the virtual environment and use newbro dev.

Run Backend

./newbro backend

FastAPI docs will be available at:

http://127.0.0.1:8000/docs

If the frontend shows errors and messages do not progress, first confirm the backend is running from the same virtual environment where the dependencies were installed.
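A quick way to probe the backend from the same machine, assuming the default 127.0.0.1:8000 bind (this helper is illustrative, not a shipped command):

```python
import urllib.error
import urllib.request

def backend_reachable(base_url: str = "http://127.0.0.1:8000",
                      timeout: float = 2.0) -> bool:
    """Best-effort probe of the FastAPI docs endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/docs", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```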

To run only the frontend:

./newbro frontend

To run only the headless connector host:

./newbro connector run

Ubuntu Systemd

For an Ubuntu server deployment from a repo checkout, install the combined system service with:

./newbro service install

./newbro service install installs or updates the unit, reloads systemd, enables the unit, and restarts the service so the latest code is live immediately.

The installed newbro.service unit runs newbro start, so it serves one main Synapse service on the public port.

This path stays inside the repo checkout. The main service serves src/synapse/ui/dist at /, keeps the normal API and websocket routes on the same origin, and mounts /api/connectors/... routes directly when connectors are enabled.

The systemd unit runs as the user who invoked ./newbro service install and reads shared runtime-plus-connector config from that user’s home directory:

~/.newbro/.env
~/.newbro/config.yaml

If you install the service as root, it will run as root and use:

/root/.newbro/.env
/root/.newbro/config.yaml

If the Codex executor is enabled, set an absolute runtime.codex_command in ~/.newbro/config.yaml so the service does not depend on an interactive shell PATH.

./newbro dev and ./newbro start do not auto-start the standalone connector host. Run ./newbro connector run separately when you want the detached connector process for direct connector testing or separate deployment.

./newbro dev is the reload-capable local iteration path. ./newbro start does not reload Python code changes, so restart it after editing backend code, connector modules, or other Python service code.

The connector host talks to the Synapse backend directly using the configured SYNAPSE_CONNECTOR_SYNAPSE_BASE_URL and does not use proxy environment variables for its internal upstream traffic.

Test

.venv/bin/python -m pytest

Frontend build check:

cd src/synapse/ui
npm run build

Release build and publish:

python3 -m pip install '.[release]'
python3 -m build
python3 -m twine check dist/*
python3 -m twine upload dist/*

Or use the helper script:

PYPI_TOKEN='pypi-...' ./scripts/publish_pypi.sh
PYPI_TOKEN='pypi-...' ./scripts/publish_pypi.sh --testpypi
./scripts/publish_pypi.sh --dry-run

Deploy UI to Vercel

The main UI lives under src/synapse/ui/.

Before deploying the frontend separately, make sure that:

  • the backend is reachable on its own public HTTPS origin
  • the public backend origin preserves secure websocket upgrades for WS /api/sessions/{session_id}/stream
  • the backend allows the Vercel frontend origin through CORS:

SYNAPSE_CORS_ALLOWED_ORIGINS=https://app.example.com,https://your-project.vercel.app
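The variable is a comma-separated origin list; parsing it amounts to the following (an illustrative sketch of the expected format, not Synapse's actual parser):

```python
def parse_allowed_origins(raw: str) -> list[str]:
    """Split a comma-separated origin list, trimming whitespace and
    dropping empty entries."""
    return [origin.strip() for origin in raw.split(",") if origin.strip()]
```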

Then deploy from the UI workspace:

cd src/synapse/ui
npx vercel env add VITE_API_BASE_URL production
npx vercel env add VITE_CONNECTOR_BASE_URL production
npx vercel --prod

Set the production frontend base URLs to your public server origin, for example:

VITE_API_BASE_URL=https://newbro.plutoless.com
VITE_CONNECTOR_BASE_URL=https://newbro.plutoless.com

If you also use Vercel preview deployments, add the same variable for the preview environment and include that preview origin in SYNAPSE_CORS_ALLOWED_ORIGINS.

If the deployed UI enables voice mode, the connector host must also allow the frontend origin. Configure that in ~/.newbro/config.yaml under:

connector_host:
  cors_allowed_origins:
    - https://newbro.agora-io.czhen.work

If the backend is served behind Nginx on your server, proxy the public session routes to the main Synapse API on port 8000, proxy /api/connectors/... to the connector host on 8010, and keep websocket upgrade headers intact for /api/sessions/{session_id}/stream. See docs/guides/vercel-ui-deployment.md for the full deployment contract and an example reverse-proxy shape.
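A minimal reverse-proxy shape matching that description might look like the following (server name, TLS setup, and exact location blocks are placeholders; treat the linked guide as the authoritative contract):

```nginx
server {
    listen 443 ssl;
    server_name synapse.example.com;  # placeholder

    # Connector host routes on port 8010
    location /api/connectors/ {
        proxy_pass http://127.0.0.1:8010;
        proxy_set_header Host $host;
    }

    # Websocket session stream: keep upgrade headers intact
    location ~ ^/api/sessions/[^/]+/stream$ {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }

    # Main Synapse API and UI on port 8000
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```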

This repo also includes a GitHub Actions workflow at .github/workflows/deploy-ui-vercel.yml:

  • pull requests deploy a Vercel preview for src/synapse/ui
  • pushes to main deploy production
  • workflow_dispatch can trigger a manual production deploy

Before enabling that workflow, configure these GitHub repository settings:

  • Actions secret: VERCEL_TOKEN
  • Actions variable or secret: VERCEL_ORG_ID
  • Actions variable or secret: VERCEL_PROJECT_ID

The production GitHub Actions deploy now injects VITE_API_BASE_URL=https://newbro.plutoless.com and VITE_CONNECTOR_BASE_URL=https://newbro.plutoless.com directly into the build so the merge-to-main path does not depend on separate Vercel production env entries. If you also use manual Vercel CLI deploys outside GitHub Actions, keep the Vercel project env aligned with those same values.
