⚡ B-FAST (Binary Fast Adaptive Serialization Transfer) - Ultra-high-performance binary serialization protocol, built in Rust for the Python and TypeScript ecosystems
⚡ B-FAST (Binary Fast Adaptive Serialization Transfer)
B-FAST is an ultra-high performance binary serialization protocol, developed in Rust for Python and TypeScript ecosystems. It's designed to replace JSON in critical routes where latency, CPU usage, and bandwidth are bottlenecks.
"Performance is not just about speed—it's about efficiency where it matters most"
B-FAST was born from the recognition that modern applications need more than just fast serialization—they need smart serialization that adapts to real-world constraints. After extensive optimization, B-FAST has found its niche in bandwidth-constrained environments, running 1.7x faster than orjson for simple objects and 5.7x faster on slow networks.
Philosophy: We believe that the future of data transfer lies not in raw CPU speed alone, but in intelligent protocols that minimize network overhead while maintaining excellent performance. B-FAST represents our contribution to a more efficient, bandwidth-conscious web.
📚 Documentation
Full documentation available at: https://marcelomarkus.github.io/b-fast/
🚀 Why B-FAST?
- Rust Engine: Native serialization without Python interpreter overhead.
- Pydantic Native: Reads Pydantic model attributes directly from memory, skipping the slow .model_dump() process.
- Zero-Copy NumPy: Serializes tensors and numeric arrays directly, achieving 14-96x speedup vs JSON/orjson.
- Parallel Compression: LZ4 with multi-thread processing for large payloads (>1MB).
- Cache Optimized: Aligned allocation and batch processing for maximum efficiency.
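The bandwidth savings of a binary wire format are easy to see even without B-FAST itself. The stdlib-only sketch below (not the B-FAST format—just an illustration of the general principle) compares JSON text against raw packed integers:

```python
import json
import struct

# 1,000 sample readings as plain Python ints (6 digits each)
values = list(range(100_000, 101_000))

# Text encoding: every digit costs a byte, plus commas, spaces, brackets
json_bytes = json.dumps(values).encode("utf-8")

# Binary encoding: each little-endian int32 is exactly 4 bytes, no delimiters
packed = struct.pack(f"<{len(values)}i", *values)

print(f"JSON: {len(json_bytes)} bytes, binary: {len(packed)} bytes")
```

For this payload the binary form is about half the size before any compression is applied; B-FAST layers LZ4 on top of its binary encoding for further reduction.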
📊 Benchmarks (Updated Results)
🚀 Simple Objects (10,000)
| Format | Time (ms) | Speedup |
|---|---|---|
| JSON | 12.0 | 1.0x |
| orjson | 8.19 | 1.5x |
| B-FAST | 4.83 | 🚀 2.5x |
B-FAST is 1.7x faster than orjson!
🔄 Round-Trip (Encode + Network + Decode)
Complete test including network transfer and deserialization (10,000 objects):
📡 100 Mbps (Slow Network)
| Format | Total Time (ms) | Speedup vs orjson |
|---|---|---|
| JSON | 114.5 | 0.8x |
| orjson | 91.7 | 1.0x |
| B-FAST + LZ4 | 16.1 | 🚀 5.7x |
📡 1 Gbps (Fast Network)
| Format | Total Time (ms) | Speedup vs orjson |
|---|---|---|
| JSON | 29.4 | 0.5x |
| orjson | 15.3 | 1.0x |
| B-FAST + LZ4 | 7.2 | 🚀 2.1x |
📡 10 Gbps (Ultra-Fast Network)
| Format | Total Time (ms) | Speedup vs orjson |
|---|---|---|
| JSON | 20.9 | 0.4x |
| orjson | 7.7 | 1.0x |
| B-FAST + LZ4 | 6.3 | 🚀 1.2x |
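The pattern in these tables follows a simple model: total time ≈ encode + payload ÷ bandwidth + decode, so a smaller payload matters most when bandwidth is scarce. The back-of-the-envelope sketch below uses hypothetical encode/decode times and an assumed 1 MB JSON payload; only the 89% LZ4 size reduction comes from the figures quoted in this README:

```python
def round_trip_ms(encode_ms: float, decode_ms: float,
                  payload_mb: float, bandwidth_mbps: float) -> float:
    """Model a round trip: encode + network transfer + decode, in ms."""
    # bandwidth in megabits/s -> megabytes/s is /8
    transfer_ms = payload_mb / (bandwidth_mbps / 8) * 1000
    return encode_ms + transfer_ms + decode_ms

# Hypothetical figures: 1.0 MB JSON payload vs 0.11 MB after LZ4 (89% smaller),
# with made-up encode/decode costs of 8 ms (JSON) and 3 ms (binary) each way
for mbps in (100, 1_000, 10_000):
    json_total = round_trip_ms(8, 8, 1.0, mbps)
    bfast_total = round_trip_ms(3, 3, 0.11, mbps)
    print(f"{mbps:>6} Mbps: JSON {json_total:.1f} ms, B-FAST {bfast_total:.1f} ms")
```

On a 100 Mbps link the transfer term dominates, so the compressed payload wins by a wide margin; at 10 Gbps the transfer term is negligible and the advantage shrinks toward the raw encode/decode difference—matching the trend in the benchmark tables above.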
🎯 Ideal Use Cases
- 📱 Mobile/IoT: 89% data savings + 5.7x performance on slow networks
- 🌐 APIs with slow networks: Up to 5.7x faster than orjson
- 📊 Data pipelines: 14-96x speedup for NumPy arrays
- 🗜️ Storage/Cache: Superior integrated compression
- 🚀 Simple objects: 1.7x faster than orjson
📦 Installation
Backend (Python)
```shell
# Basic installation
pip install bfast-py

# With FastAPI support
pip install "bfast-py[fastapi]"
```

or with uv:

```shell
uv add bfast-py

# or
uv add "bfast-py[fastapi]"
```
Frontend (TypeScript)
```shell
npm install bfast-client
```
🛠️ How to Use
Backend (Python)
1. FastAPI (Direct Integration) ⭐ Recommended
B-FAST includes a built-in BFastResponse for seamless integration.
```python
from fastapi import FastAPI
from pydantic import BaseModel
from b_fast import BFastResponse

app = FastAPI()

class User(BaseModel):
    id: int
    name: str

@app.get("/users", response_class=BFastResponse)
async def get_users():
    # Returns binary B-FAST data with automatic LZ4 compression
    return [User(id=i, name=f"User {i}") for i in range(1000)]
```
2. Flask
```python
from flask import Flask, Response
import b_fast

app = Flask(__name__)
encoder = b_fast.BFast()

@app.route('/users')
def get_users():
    users = [{"id": i, "name": f"User {i}"} for i in range(1000)]
    data = encoder.encode_packed(users, compress=True)
    return Response(data, mimetype='application/octet-stream')
```
3. Django
```python
from django.http import HttpResponse
import b_fast

encoder = b_fast.BFast()

def get_users(request):
    users = [{"id": i, "name": f"User {i}"} for i in range(1000)]
    data = encoder.encode_packed(users, compress=True)
    return HttpResponse(data, content_type='application/octet-stream')
```
4. Any Python Framework
```python
import b_fast

encoder = b_fast.BFast()

# Encode your data
data = encoder.encode_packed(your_data, compress=True)

# Return `data` as bytes (binary response)
```
Frontend (TypeScript)
```typescript
import { BFastDecoder } from 'bfast-client';

async function loadData() {
  const response = await fetch('/users');
  const buffer = await response.arrayBuffer();
  // Decodes and decompresses LZ4 automatically
  const users = BFastDecoder.decode(buffer);
  console.log(users);
}
```
About B-FAST
Key Achievements:
- 🚀 1.7x faster than orjson for simple objects
- 🚀 5.7x faster than orjson on 100 Mbps networks (round-trip)
- 📦 89% smaller payloads with built-in LZ4 compression
- ⚡ 14-96x speedup for NumPy arrays
- 🎯 Competitive even on ultra-fast 10 Gbps networks
B-FAST performance comparison across different scenarios: simple objects, large objects on a 100 Mbps network, NumPy arrays, and payload size. B-FAST leads in speed (1.7x-5.7x faster end-to-end, up to 96x for NumPy arrays) and bandwidth efficiency (89% reduction with LZ4).
Developed by: marcelomarkus
📄 License
Distributed under the MIT License. See LICENSE for more information.
Download files
Built Distributions
File details
Details for the file bfast_py-1.2.1-cp38-abi3-win_amd64.whl.
File metadata
- Download URL: bfast_py-1.2.1-cp38-abi3-win_amd64.whl
- Upload date:
- Size: 4.2 MB
- Tags: CPython 3.8+, Windows x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `07a0c17cd624c19e4aadfc47b56c60b7527d39a99cb4cf9aef0993bf2a2f3c44` |
| MD5 | `b595fee52cbc807672985f30a6f5b8db` |
| BLAKE2b-256 | `3082a42d28cb954b102bbcece00df8c6bed1c73cf72417e8c270146d863e1c23` |
File details
Details for the file bfast_py-1.2.1-cp38-abi3-manylinux_2_34_x86_64.whl.
File metadata
- Download URL: bfast_py-1.2.1-cp38-abi3-manylinux_2_34_x86_64.whl
- Upload date:
- Size: 4.4 MB
- Tags: CPython 3.8+, manylinux: glibc 2.34+ x86-64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c9ca1ab47b89ad59086e1761c1e76bc3a76a4f5de51c95896cc8f0586cdbf61d` |
| MD5 | `b725a9a5496f1c559f181e3aa8f57513` |
| BLAKE2b-256 | `9e9b953446b9399f50f31bc402600ee3ae572193ad425496d9a92c9cbf6f460a` |
File details
Details for the file bfast_py-1.2.1-cp38-abi3-macosx_11_0_arm64.whl.
File metadata
- Download URL: bfast_py-1.2.1-cp38-abi3-macosx_11_0_arm64.whl
- Upload date:
- Size: 4.3 MB
- Tags: CPython 3.8+, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `bfe668028880e88bf845e336404ba749d83ae586652fb8dfc23a1dd649b728d1` |
| MD5 | `2d635f614ec76a09ecb288221e09ffd8` |
| BLAKE2b-256 | `d22826c4ff52fc8e76b56c08c1a63b21ceed90b58fd7a41f50339dff0d44c3cb` |