
HTTP API for TON (The Open Network)

Project description


HTTP API for The Open Network

Since TON nodes use their own ADNL binary transport protocol, an intermediate service is needed for HTTP connections.

TON HTTP API is such an intermediate service: it receives requests via HTTP and queries the lite servers of the TON network using tonlibjson.

You can use the ready-made toncenter.com service or start your own instance.
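
For example, here is a minimal Python sketch (using the requests library) that queries the public toncenter.com instance; the address is a placeholder, and the public service is rate limited and may require an API key for heavier use. Point base_url at your own deployment instead if you run one:

    import requests

    # Public toncenter.com instance; replace with the URL of your own deployment if you run one.
    base_url = "https://toncenter.com/api/v2"

    # Placeholder: substitute any TON account address you want to inspect.
    address = "<TON account address>"

    resp = requests.get(f"{base_url}/getAddressInformation", params={"address": address}, timeout=10)
    resp.raise_for_status()
    print(resp.json())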

Building and running

Recommended hardware:

  • CPU architecture: x86_64 or arm64.
  • HTTP API only: 1 vCPU, 2 GB RAM.
  • HTTP API with cache enabled: 2 vCPUs, 4 GB RAM.

There are two main ways to run TON HTTP API:

  • Local (experimental): works on the following platforms: Ubuntu Linux (x86_64, arm64), macOS (Intel x86_64, Apple M1 arm64) and Windows (x86_64).
  • Docker Compose: flexible configuration, recommended for production environments, works on any x86_64 and arm64 OS with Docker available.

Local run (experimental)

Note: this is a simple but less stable way to run the service. We do not recommend using it in production.

  • (Windows only, first time) Install OpenSSL v1.1.1 for win64 from here.
  • Install package: pip install ton-http-api.
  • Run the service with ton-http-api. This command runs the service with the mainnet configuration.
    • Run ton-http-api --help to see the list of parameters.

Docker Compose

  • (First time) Install required tools: docker, docker-compose, curl.
    • For Ubuntu: run scripts/setup.sh from the root of the repo.
    • For macOS and Windows: install Docker Desktop.
    • Note: we recommend using Docker Compose V2.
  • Download TON configuration files to private folder:
    mkdir private
    curl -sL https://ton-blockchain.github.io/global.config.json > private/mainnet.json
    curl -sL https://ton-blockchain.github.io/testnet-global.config.json > private/testnet.json
    
  • Run ./configure.py to create .env file with necessary environment variables (see Configuration for details).
  • Build services: docker-compose build.
    • Or pull latest images: docker-compose pull.
  • Run services: docker-compose up -d (a quick responsiveness check is sketched after this list).
  • Stop services: docker-compose down.
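
Once the services are up, a minimal check like the following sketch can confirm the API responds. It assumes the default TON_API_HTTP_PORT of 80 and no path prefix (TON_API_ROOT_PATH left at /); adjust host, port and prefix to your configuration:

    import requests

    # Default port is 80 (TON_API_HTTP_PORT); add your TON_API_ROOT_PATH prefix if you set one.
    base_url = "http://localhost:80"

    # getMasterchainInfo takes no parameters, so it is a convenient liveness check:
    # a successful response means the service reached the configured lite servers.
    resp = requests.get(f"{base_url}/getMasterchainInfo", timeout=10)
    resp.raise_for_status()
    print(resp.json())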

Configuration

You should specify environment parameters and run ./configure.py to create the .env file:

    export TON_API_TONLIB_LITESERVER_CONFIG=private/testnet.json
    ./configure.py

The service supports the following environment variables:

Webserver settings

  • TON_API_HTTP_PORT (default: 80)

    Port for HTTP connections of API service.

  • TON_API_ROOT_PATH (default: /)

    If you use a proxy server such as Nginx or Traefik, you might change the default API path prefix (e.g. to /api/v2). If so, you have to pass the path prefix to the API service in this variable.

  • TON_API_WEBSERVERS_WORKERS (default: 1)

    Number of webserver processes. If your server is under high load, try increasing this value to increase RPS. We recommend setting it to half the number of CPU cores.

  • TON_API_GET_METHODS_ENABLED (default: 1)

    Enables the runGetMethod endpoint.

  • TON_API_JSON_RPC_ENABLED (default: 1)

    Enables the jsonRPC endpoint (see the usage sketch after this list).

  • TON_API_LOGS_JSONIFY (default: 0)

    Enables printing all logs in JSON format.

  • TON_API_LOGS_LEVEL (default: ERROR)

    Defines log verbosity level. Values allowed: DEBUG, INFO, WARNING, ERROR, CRITICAL.

  • TON_API_GUNICORN_FLAGS (default: empty)

    Additional Gunicorn command line arguments.
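
When TON_API_JSON_RPC_ENABLED is set, the same methods that are exposed as individual endpoints can also be called through the single jsonRPC endpoint. A minimal sketch, assuming a local instance without a path prefix and a placeholder address:

    import requests

    # Adjust host, port and prefix to your deployment (TON_API_HTTP_PORT, TON_API_ROOT_PATH).
    url = "http://localhost:80/jsonRPC"

    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "getAddressInformation",               # name of the API method to call
        "params": {"address": "<TON account address>"},  # its parameters
    }

    resp = requests.post(url, json=payload, timeout=10)
    resp.raise_for_status()
    print(resp.json())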

Tonlib settings

  • TON_API_TONLIB_LITESERVER_CONFIG (default for Docker: private/mainnet.json, for local run: https://ton.org/global-config.json)

    Path to the config file with lite server information. For a local run you can also pass a URL to download the config; Docker supports only a path to a file.

  • TON_API_TONLIB_KEYSTORE (default for Docker: /tmp/ton_keystore, for local run: ./ton_keystore/)

    Path to tonlib keystore.

  • TON_API_TONLIB_PARALLEL_REQUESTS_PER_LITESERVER (default: 50)

    Maximum number of parallel requests per lite server for each worker.

  • TON_API_TONLIB_CDLL_PATH (default: empty)

    Path to the tonlibjson binary. This can be useful if you want to run the service on an unsupported platform and have built the libtonlibjson library manually (a quick load check is sketched after this list).

  • TON_API_TONLIB_REQUEST_TIMEOUT (default: 10)

    Timeout for liteserver requests.
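
If you point TON_API_TONLIB_CDLL_PATH at a manually built library, a quick sketch like this can confirm the binary at least loads on your platform before you wire it into the service (the path is a placeholder):

    import ctypes

    # Placeholder path to your manually built libtonlibjson; use .dylib on macOS or .dll on Windows.
    lib = ctypes.CDLL("/path/to/libtonlibjson.so")
    print("loaded:", lib)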

Cache configuration

  • TON_API_CACHE_ENABLED (default: 0)

    Enables caching lite server responses with Redis (a quick way to observe the effect is sketched after this list).

  • TON_API_CACHE_REDIS_ENDPOINT (default: localhost, docker: cache_redis)

    Redis cache service host.

  • TON_API_CACHE_REDIS_PORT (default: 6379)

    Redis cache service port.

  • TON_API_CACHE_REDIS_TIMEOUT (default: 1)

    Timeout for Redis cache requests.
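
With caching enabled, repeated identical requests can be answered from Redis instead of going to a lite server again. A rough way to observe this is to time two identical calls (a sketch only; which methods are cached and for how long depends on the service configuration):

    import time
    import requests

    # Adjust to your deployment; the effect is only visible with TON_API_CACHE_ENABLED=1.
    base_url = "http://localhost:80"
    address = "<TON account address>"  # placeholder

    for attempt in (1, 2):
        start = time.monotonic()
        resp = requests.get(f"{base_url}/getAddressInformation", params={"address": address}, timeout=10)
        resp.raise_for_status()
        print(f"request {attempt}: {time.monotonic() - start:.3f}s")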

FAQ

How to point the service to my own lite server?

To point the HTTP API to your own lite server, set TON_API_TONLIB_LITESERVER_CONFIG to a config file that contains only your lite server.

  • If you use MyTonCtrl on your node you can generate config file with these commands:
    $ mytonctrl
    MyTonCtrl> installer
    MyTonInstaller> clcf
    
    Config file will be saved at /usr/bin/ton/local.config.json.
  • If you don't use MyTonCtrl: copy private/mainnet.json and overwrite the liteservers section with your lite server's IP, port and public key. To get the public key from the liteserver.pub file, use the following script (a sketch that assembles the whole config file is shown after this list):
    python -c 'import codecs; pub=open("liteserver.pub","rb").read()[4:]; print(codecs.encode(pub,"base64").decode().strip())'
    
  • Once the config file is created, set the TON_API_TONLIB_LITESERVER_CONFIG variable to its path, run ./configure.py and rebuild the project.
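
The sketch below assembles such a config file in Python rather than by hand. It assumes the downloaded private/mainnet.json as a starting point, placeholder IP and port values, and a liteserver.pub file in the current directory; the output filename is just an example. TON global configs store the IPv4 address as a signed 32-bit integer, and liteserver.pub starts with a 4-byte type prefix followed by the 32-byte key.

    import base64
    import json
    import socket
    import struct

    LS_IP = "203.0.113.10"   # placeholder: your lite server's public IPv4 address
    LS_PORT = 4443           # placeholder: your lite server's port

    # TON global configs store the IPv4 address as a signed 32-bit integer.
    ip_int = struct.unpack(">i", socket.inet_aton(LS_IP))[0]

    # liteserver.pub: 4-byte type prefix followed by the 32-byte ed25519 public key.
    with open("liteserver.pub", "rb") as f:
        key_b64 = base64.b64encode(f.read()[4:]).decode()

    # Start from the downloaded mainnet config and keep only your lite server.
    with open("private/mainnet.json") as f:
        config = json.load(f)

    config["liteservers"] = [{
        "ip": ip_int,
        "port": LS_PORT,
        "id": {"@type": "pub.ed25519", "key": key_b64},
    }]

    with open("private/my_liteserver.json", "w") as f:
        json.dump(config, f, indent=2)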

How to run multiple API instances on a single machine?

  • Clone the repo once per instance, into folders with different names (otherwise the Docker Compose containers will conflict).
  • Configure each instance to use a unique port (environment variable TON_API_HTTP_PORT).
  • Build and run each instance.

How to update tonlibjson library?

The libtonlibjson binary has been moved to pytonlib.

  • Docker Compose: docker-compose build --no-cache.
  • Local run: pip install -U ton-http-api.

Download files

Download the file for your platform.

Source Distribution

ton_http_api-2.0.64.tar.gz (22.9 kB)

Uploaded Source

Built Distribution


ton_http_api-2.0.64-py3-none-any.whl (22.4 kB)

Uploaded Python 3

File details

Details for the file ton_http_api-2.0.64.tar.gz.

File metadata

  • Download URL: ton_http_api-2.0.64.tar.gz
  • Upload date:
  • Size: 22.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ton_http_api-2.0.64.tar.gz

  • SHA256: 9e985b5d3c08f77256c956a78310aeccd25c55f13ae76680777ffb6b8c7e8605
  • MD5: f0e3f4283877d04e14dae6238a15166e
  • BLAKE2b-256: 9bb36f0a742b3ace101df3f5ec26f4b0f37c44a74bd408f3b424f5dee3feceb1
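
To check a downloaded file against the hashes above, you can compute its SHA256 locally, for example:

    import hashlib

    # Compare the printed digest with the SHA256 value listed above.
    with open("ton_http_api-2.0.64.tar.gz", "rb") as f:
        print(hashlib.sha256(f.read()).hexdigest())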


File details

Details for the file ton_http_api-2.0.64-py3-none-any.whl.

File metadata

  • Download URL: ton_http_api-2.0.64-py3-none-any.whl
  • Upload date:
  • Size: 22.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ton_http_api-2.0.64-py3-none-any.whl

  • SHA256: 8867352250b9c9f9f9c8d23225f654d160d249ecf76d8084bd47d05f0ff5fd98
  • MD5: 94bd0b3c73d56c8de03d74b52f4b7c43
  • BLAKE2b-256: f1f114ceab7c79ca7349ea168105bad5ca2cd7e020293270d7425cb5a4c22f46

