Deep-Learning for Optimal VWAP Execution
Project description
Recurrent Neural Networks for Dynamic VWAP Execution: An Adaptive Trading Strategy with Temporal Kolmogorov-Arnold Networks
This repository presents Aplo's latest research on dynamic VWAP execution and contains the code discussed in the paper *Recurrent Neural Networks for Dynamic VWAP Execution: An Adaptive Trading Strategy with Temporal Kolmogorov-Arnold Networks*.
Note: To keep serialization simple, this version includes only the TKAN-based dynamic VWAP model; an LSTM-based variant can be implemented by modifying the code accordingly.
Model Overview
The dynamic VWAP model is implemented as a Keras model compatible with any backend (TensorFlow, JAX, or PyTorch). We recommend using JAX for optimal performance.
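Keras 3 selects its backend through the `KERAS_BACKEND` environment variable. A minimal sketch of selecting JAX (the variable must be set before Keras is first imported in the process):

```python
import os

# The backend must be chosen before Keras is imported anywhere in the process;
# any subsequent `import keras` will pick it up.
os.environ["KERAS_BACKEND"] = "jax"  # alternatives: "tensorflow", "torch"
```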
Installation
1. Clone the repository:

```bash
git clone <repository-url>
cd <repository-directory>
```
2. Install the package:

```bash
pip install .
```

Alternatively, using Poetry:

```bash
poetry install
```
Usage
Below is a minimal example demonstrating how to use the dynamic VWAP model:
```python
from dynamic_vwap import DynamicVWAP, quadratic_vwap_loss, absolute_vwap_loss, volume_curve_loss
from dynamic_vwap.data_formater import full_generate
import pandas as pd

# Parameters
lookback = 120  # Number of past time steps used as input
n_ahead = 12    # Number of future time steps to predict
target_asset = 'AAPL'
BATCH_SIZE = 128
N_MAX_EPOCHS = 1000

# Load your data (here using Parquet files)
volumes = pd.read_parquet('path_to_your_volume_data.parquet')
notionals = pd.read_parquet('path_to_your_notionals_data.parquet')

# Generate training and testing datasets
X_train, X_test, y_train, y_test = full_generate(
    volumes,
    notionals,
    target_asset,
    lookback=lookback,
    n_ahead=n_ahead,
    test_split=0.2,
    autoscale_target=True
)

# Initialize the dynamic VWAP model
model = DynamicVWAP(
    lookback=lookback,
    n_ahead=n_ahead,
    hidden_size=100,
    hidden_rnn_layer=2
)

# Compile the model with a VWAP-specific loss function
model.compile(optimizer='adam', loss=quadratic_vwap_loss)

# Train the model
history = model.fit(
    X_train, y_train,
    batch_size=BATCH_SIZE,
    epochs=N_MAX_EPOCHS,
    validation_split=0.2,
    shuffle=True,
    verbose=False
)

# Make predictions
predictions = model.predict(X_test, verbose=False)
```
Model Parameters
- lookback: Number of past time steps used as input.
- n_ahead: Number of future time steps for which the dynamic volume curve is predicted.
- hidden_size: Number of units in the hidden layers of the internal RNN.
- hidden_rnn_layer: Number of TKAN layers in the internal RNN.
Note: The input data matrix must have a sequence length of lookback + n_ahead - 1 time steps. This format ensures that the model receives the necessary ahead inputs during training. For real-time applications, appropriate padding may be needed.
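As an illustration of the required sequence length, the sketch below zero-pads a live feature window so that it reaches the expected `lookback + n_ahead - 1` time steps (the zero-padding scheme is an assumption; choose one appropriate for your features):

```python
import numpy as np

lookback, n_ahead, num_features = 120, 12, 5
required_len = lookback + n_ahead - 1  # 131 time steps expected by the model

# Suppose only the past `lookback` steps are observed in real time.
live_window = np.random.rand(1, lookback, num_features)

# Zero-pad the unobserved ahead steps to reach the required length.
pad = np.zeros((1, required_len - lookback, num_features))
padded_input = np.concatenate([live_window, pad], axis=1)
print(padded_input.shape)  # (1, 131, 5)
```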
Loss Functions
The package provides the following loss functions to effectively minimize the deviation between the achieved VWAP and the market VWAP:
- quadratic_vwap_loss
- absolute_vwap_loss
- volume_curve_loss
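To build intuition for what these losses measure, the sketch below computes a quadratic VWAP deviation in plain NumPy. This is an illustrative reimplementation, not the package's internal code: the VWAP achieved by a candidate allocation is compared against the market VWAP, and the squared difference is the penalty.

```python
import numpy as np

prices = np.array([100.0, 101.0, 99.5, 100.5])
market_volumes = np.array([500.0, 300.0, 100.0, 100.0])

# Candidate volume allocation over the execution horizon (sums to 1).
allocation = np.array([0.4, 0.3, 0.2, 0.1])

# Market VWAP: volume-weighted average of prices.
market_vwap = np.sum(prices * market_volumes) / np.sum(market_volumes)

# Achieved VWAP under the allocation (weights already sum to 1).
achieved_vwap = np.sum(prices * allocation)

# Quadratic penalty on the deviation from the market VWAP.
quadratic_loss = (achieved_vwap - market_vwap) ** 2
print(quadratic_loss)
```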
Data Formatting
The model expects inputs in matrix format rather than a dictionary. The expected shapes are:
- Features input: a NumPy array of shape `(num_samples, lookback + n_ahead - 1, num_features)`.
- Targets: a NumPy array of shape `(num_samples, n_ahead, 2)`, where, along the last dimension, the first element corresponds to volume allocations and the second to prices.
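A quick sanity check of these shapes with randomly generated dummy arrays (all names here are illustrative):

```python
import numpy as np

num_samples, lookback, n_ahead, num_features = 256, 120, 12, 5

# Features: one sequence of lookback + n_ahead - 1 steps per sample.
X = np.random.rand(num_samples, lookback + n_ahead - 1, num_features)

# Targets: volume allocations and prices stacked along the last dimension.
y = np.random.rand(num_samples, n_ahead, 2)

volume_allocations = y[:, :, 0]  # first slot along the last dimension
prices = y[:, :, 1]              # second slot along the last dimension
print(X.shape, volume_allocations.shape, prices.shape)
```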
The provided helper function full_generate in data_formater.py facilitates the creation of training and testing datasets. For example:
```python
import pandas as pd
from dynamic_vwap.data_formater import full_generate

volumes = pd.read_parquet('path_to_your_volume_data.parquet')
notionals = pd.read_parquet('path_to_your_notionals_data.parquet')

X_train, X_test, y_train, y_test = full_generate(
    volumes,
    notionals,
    target_asset='AAPL',
    lookback=120,
    n_ahead=12,
    test_split=0.2,
    autoscale_target=True
)
```
Example and Results
For detailed examples, including reproduction of experimental results and graphs from the paper, please refer to the example_and_results folder.
License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Download files
File details
Details for the file dynamic_vwap-0.1.0.tar.gz.
File metadata
- Download URL: dynamic_vwap-0.1.0.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Linux/6.8.0-52-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6dafc314cdf280b3c968692614583baa3a969602f37353236d803e6966d245f5 |
| MD5 | bfd981b47492bc3aed1bc199f3c6a3ff |
| BLAKE2b-256 | 6431ad5b56d37477be2f664d92c0d676799e4bb0488ad7988d7e3253d4d0c221 |
File details
Details for the file dynamic_vwap-0.1.0-py3-none-any.whl.
File metadata
- Download URL: dynamic_vwap-0.1.0-py3-none-any.whl
- Upload date:
- Size: 6.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Linux/6.8.0-52-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 20853f25e737f66c8f01aa3320fc1b349e8f56ef89ff5fb8faff931ea975b0db |
| MD5 | df410e8aa011db0d47c537332292d512 |
| BLAKE2b-256 | 0798d6f232999056dc0f40e5ab5a6d46ae75b66642b243d8b6f891e112b99bc6 |