
HTTP API for TON (The Open Network)

Project description


HTTP API for The Open Network

Since TON nodes use their own ADNL binary transport protocol, an intermediate service is needed for an HTTP connection.

TON HTTP API is such an intermediate service: it receives requests via HTTP and accesses the lite servers of the TON network using tonlibjson.

You can use the ready-made toncenter.com service or start your own instance.
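
To give a feel for what such HTTP requests look like, here is an illustrative call against the public toncenter.com instance (the getMasterchainInfo method and the /api/v2 prefix are those exposed by toncenter.com; a self-hosted instance serves the same methods under its own host and path prefix):

    # Returns a JSON object describing the current masterchain state
    curl -s https://toncenter.com/api/v2/getMasterchainInfo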

Building and running

Recommended hardware:

  • CPU architecture: x86_64 or arm64.
  • HTTP API only: 1 vCPU, 2 GB RAM.
  • HTTP API with cache enabled: 2 vCPUs, 4 GB RAM.

There are two main ways to run TON HTTP API:

  • Local (experimental): works on the following platforms: Ubuntu Linux (x86_64, arm64), macOS (Intel x86_64, Apple M1 arm64), and Windows (x86_64).
  • Docker Compose: flexible configuration, recommended for production environments; works on any x86_64 or arm64 OS with Docker available.

Local run (experimental)

Note: this is a simple but not very stable way to run the service. We do not recommend using it in production.

  • (Windows only, first time) Install OpenSSL v1.1.1 for win64 from here.
  • Install package: pip install ton-http-api.
  • Run the service with ton-http-api. This command runs the service with the mainnet configuration; a quick check that the service is up is shown after this list.
    • Run ton-http-api --help to see the list of available parameters.
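
If the service started successfully, a simple request should return a JSON response. A minimal check, assuming the default root path and a hypothetical port of 8081 (see ton-http-api --help for the parameter that sets the actual port):

    curl -s http://localhost:8081/getMasterchainInfo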

Docker Compose

  • (First time) Install required tools: docker, docker-compose, curl.
    • For Ubuntu: run scripts/setup.sh from the root of the repo.
    • For MacOS and Windows: install Docker Desktop.
    • Note: we recommend using Docker Compose V2.
  • Download TON configuration files to private folder:
    mkdir private
    curl -sL https://ton-blockchain.github.io/global.config.json > private/mainnet.json
    curl -sL https://ton-blockchain.github.io/testnet-global.config.json > private/testnet.json
    
  • Run ./configure.py to create .env file with necessary environment variables (see Configuration for details).
  • Build services: docker-compose build.
    • Or pull latest images: docker-compose pull.
  • Run services: docker-compose up -d.
  • Stop services: docker-compose down.

Configuration

You should specify environment parameters and run ./configure.py to create the .env file:

    export TON_API_TONLIB_LITESERVER_CONFIG=private/testnet.json
    ./configure.py

The service supports the following environment variables:

Webserver settings

  • TON_API_HTTP_PORT (default: 80)

    Port for HTTP connections of API service.

  • TON_API_ROOT_PATH (default: /)

    If you use a proxy server such as Nginx or Traefik, you may want to change the default API path prefix (e.g. to /api/v2). If so, you have to pass that path prefix to the API service in this variable.

  • TON_API_WEBSERVERS_WORKERS (default: 1)

    Number of webserver processes. If your server is under high load, try increasing this value to increase RPS. We recommend setting it to about half the number of CPU cores.

  • TON_API_GET_METHODS_ENABLED (default: 1)

    Enables runGetMethod endpoint.

  • TON_API_JSON_RPC_ENABLED (default: 1)

    Enables the jsonRPC endpoint (see the example after this list).

  • TON_API_LOGS_JSONIFY (default: 0)

    Enables printing all logs in json format.

  • TON_API_LOGS_LEVEL (default: ERROR)

    Defines log verbosity level. Allowed values: DEBUG, INFO, WARNING, ERROR, CRITICAL.

  • TON_API_GUNICORN_FLAGS (default: empty)

    Additional Gunicorn command line arguments.
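
As an illustration of the jsonRPC endpoint mentioned above, a request wrapping the getMasterchainInfo method could look like this; the request shape follows the public toncenter.com API, so adjust the host and path prefix for your own instance:

    curl -s -X POST https://toncenter.com/api/v2/jsonRPC \
         -H 'Content-Type: application/json' \
         -d '{"jsonrpc": "2.0", "id": 1, "method": "getMasterchainInfo", "params": {}}'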

Tonlib settings

  • TON_API_TONLIB_LITESERVER_CONFIG (default: private/mainnet.json in Docker, https://ton.org/global-config.json for local run)

    Path to the config file with lite server information. For a local run you can pass a URL to download the config; Docker supports only a file path.

  • TON_API_TONLIB_KEYSTORE (default: /tmp/ton_keystore in Docker, ./ton_keystore/ for local run)

    Path to tonlib keystore.

  • TON_API_TONLIB_PARALLEL_REQUESTS_PER_LITESERVER (default: 50)

    Maximum number of parallel requests per lite server for each worker.

  • TON_API_TONLIB_CDLL_PATH (default: empty)

    Path to the tonlibjson binary. This can be useful if you want to run the service on an unsupported platform and have built the libtonlibjson library manually.

  • TON_API_TONLIB_REQUEST_TIMEOUT (default: 10)

    Timeout for liteserver requests.

Cache configuration

  • TON_API_CACHE_ENABLED (default: 0)

    Enables caching lite server responses with Redis.

  • TON_API_CACHE_REDIS_ENDPOINT (default: localhost, docker: cache_redis)

    Redis cache service host.

  • TON_API_CACHE_REDIS_PORT (default: 6379)

    Redis cache service port.

  • TON_API_CACHE_REDIS_TIMEOUT (default: 1)

    Redis cache service timeout.
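
Putting the variables together, a .env generated by ./configure.py for a Docker deployment with caching enabled could look roughly like this (the values are illustrative, not recommended defaults):

    TON_API_HTTP_PORT=8081
    TON_API_ROOT_PATH=/api/v2
    TON_API_WEBSERVERS_WORKERS=2
    TON_API_LOGS_LEVEL=INFO
    TON_API_TONLIB_LITESERVER_CONFIG=private/mainnet.json
    TON_API_TONLIB_KEYSTORE=/tmp/ton_keystore
    TON_API_CACHE_ENABLED=1
    TON_API_CACHE_REDIS_ENDPOINT=cache_redis
    TON_API_CACHE_REDIS_PORT=6379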

FAQ

How to point the service to my own lite server?

To point the HTTP API to your own lite server, set TON_API_TONLIB_LITESERVER_CONFIG to a config file that contains only your lite server.

  • If you use MyTonCtrl on your node you can generate config file with these commands:
    $ mytonctrl
    MyTonCtrl> installer
    MyTonInstaller> clcf
    
    Config file will be saved at /usr/bin/ton/local.config.json.
  • If you don't use MyTonCtrl: copy private/mainnet.json and overwrite the liteservers section with your lite server's IP, port and public key (see the sketch after this list). To get the public key from a liteserver.pub file, use the following script:
    python -c 'import codecs; pub=open("liteserver.pub", "rb").read()[4:]; print(codecs.encode(pub, "base64").decode().strip())'
    
  • Once the config file is created, assign its path to the TON_API_TONLIB_LITESERVER_CONFIG variable, run ./configure.py and rebuild the project.
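
For reference, the liteservers section of the config file (as found in private/mainnet.json) has roughly the following shape; the IP address is stored as a signed integer, and the values below are placeholders rather than a real server:

    "liteservers": [
      {
        "ip": 1592601963,
        "port": 13833,
        "id": {
          "@type": "pub.ed25519",
          "key": "<base64 public key from the script above>"
        }
      }
    ]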

How to run multiple API instances on a single machine?

  • Clone the repo once per instance, into folders with different names (otherwise the Docker Compose containers will conflict).
  • Configure each instance to use a unique port (environment variable TON_API_HTTP_PORT).
  • Build and run each instance; a sketch is given below.
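
A minimal sketch for two instances; the repository URL and the way the port is chosen are assumptions for illustration:

    git clone https://github.com/toncenter/ton-http-api api-mainnet
    git clone https://github.com/toncenter/ton-http-api api-testnet
    # In each folder: export a unique TON_API_HTTP_PORT, run ./configure.py,
    # then build and start with docker-compose up -d as described above.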

How to update tonlibjson library?

The libtonlibjson binary has been moved to the pytonlib package.

  • Docker Compose: docker-compose build --no-cache.
  • Local run: pip install -U ton-http-api.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ton-http-api-2.0.22.tar.gz (22.2 kB)

Uploaded Source

Built Distribution

ton_http_api-2.0.22-py3-none-any.whl (21.8 kB)

Uploaded Python 3

File details

Details for the file ton-http-api-2.0.22.tar.gz.

File metadata

  • Download URL: ton-http-api-2.0.22.tar.gz
  • Upload date:
  • Size: 22.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for ton-http-api-2.0.22.tar.gz

  • SHA256: d51232f1de85e36d963ca452c1420199ec0f92a1eceff2566108d584d3795b41
  • MD5: 06d689325fb639cd14ec3044db9cb15a
  • BLAKE2b-256: 4ca23fa6937d81e926f62903c07f737c885c704be947755c7e812295f004c916


File details

Details for the file ton_http_api-2.0.22-py3-none-any.whl.

File metadata

File hashes

Hashes for ton_http_api-2.0.22-py3-none-any.whl

  • SHA256: ad308531f47a874911200c1bee17a60625cfbf5d7788485fa229c0bca7d59424
  • MD5: bd45b5a4040a85a4c1ecbd9c334ecb1b
  • BLAKE2b-256: 1197de12b83d513b5c4076d2b07f767eb4f8ff5ab0d14b06bb0a04fa4091c024

