
A library to sample temporal random walks from in-memory temporal graphs

Project description

🚀 Temporal Random Walk


A high-performance temporal random walk sampler for dynamic networks with GPU acceleration. Built for scale.


🔥 Why Temporal Random Walk?

Performance First – GPU-accelerated sampling for massive networks
Memory Efficient – Smart memory management for large graphs
Flexible Integration – Easy Python bindings with NumPy/NetworkX support
Production Ready – Backed by hundreds of extensive unit tests
Multi-Platform – Builds and runs seamlessly on devices with or without CUDA


⚡ Quick Start

from temporal_random_walk import TemporalRandomWalk

# Create a directed temporal graph
walker = TemporalRandomWalk(is_directed=True, use_gpu=True, max_time_capacity=-1)

# Add edges - can be NumPy arrays or Python lists
sources = [3, 2, 0, 3, 3, 1]
targets = [4, 4, 2, 1, 2, 4]
timestamps = [71, 82, 19, 34, 79, 19]

walker.add_multiple_edges(sources, targets, timestamps)

# Sample walks with exponential time bias
walk_nodes, walk_timestamps, walk_lens, edge_features = walker.get_random_walks_and_times_for_all_nodes(
    max_walk_len=5,
    walk_bias="ExponentialIndex",
    num_walks_per_node=10,
    initial_edge_bias="Uniform"
)
# edge_features is None when no edge features were added (feature_dim=0)
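The walk arrays come back together with per-walk lengths, so walks shorter than max_walk_len can be recovered by trimming. A minimal pure-Python sketch of that trimming, assuming padded rows of length max_walk_len (the hypothetical -1 padding value and 2-D layout here are illustrative, not the library's exact output format):

```python
# Hypothetical padded output for 3 walks with max_walk_len = 5.
# -1 marks padding; the library's actual layout and sentinel may differ.
walk_nodes = [
    [3, 4, 2, -1, -1],
    [0, 2, 4, 1, 3],
    [1, 4, -1, -1, -1],
]
walk_lens = [3, 5, 2]

# Keep only the valid prefix of each walk.
trimmed = [nodes[:n] for nodes, n in zip(walk_nodes, walk_lens)]
print(trimmed)  # [[3, 4, 2], [0, 2, 4, 1, 3], [1, 4]]
```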

✨ Key Features

  • GPU acceleration for large graphs
  • 🎯 Multiple sampling strategies – Uniform, Linear, Exponential
  • 🧠 Advanced temporal biases – ExponentialWeight (CTDNE-style), NeurTWs (SpatioTemporal bias), and TemporalNode2Vec
  • 🔄 Forward & backward temporal walks
  • 📡 Rolling window support for streaming data
  • 🏷️ Optional edge feature propagation from input edges to sampled walks
  • 🔗 NetworkX integration
  • 🛠️ Efficient memory management
  • ⚙️ Selectively uses the C++ standard library or the Thrust API depending on hardware availability and configuration
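The rolling-window behavior controlled by max_time_capacity can be pictured as evicting edges whose timestamp falls outside the most recent time window. A conceptual pure-Python sketch (this is an illustration of the idea, not the library's implementation; the function name and deque representation are made up for the example):

```python
from collections import deque

def add_edge_with_window(edges, edge, max_time_capacity):
    """Append an edge and evict edges older than the time window.

    edges: deque of (src, dst, ts) kept in timestamp order.
    max_time_capacity: window size; -1 disables eviction.
    """
    edges.append(edge)
    if max_time_capacity < 0:
        return
    newest_ts = edge[2]
    while edges and edges[0][2] < newest_ts - max_time_capacity:
        edges.popleft()

edges = deque()
for e in [(0, 1, 10), (1, 2, 60), (2, 3, 100)]:
    add_edge_with_window(edges, e, max_time_capacity=50)

print(list(edges))  # [(1, 2, 60), (2, 3, 100)] -- the edge at t=10 was evicted
```

Passing max_time_capacity=-1, as in the quick-start example, keeps the full history.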

🏷️ Edge Features (Optional)

If your edges carry attributes (weights, embeddings, types, etc.), you can pass them to add_multiple_edges(...) and receive aligned edge features for each sampled transition.

import numpy as np
from temporal_random_walk import TemporalRandomWalk

walker = TemporalRandomWalk(is_directed=True, use_gpu=False)

sources = np.array([0, 0, 1], dtype=np.int32)
targets = np.array([1, 2, 2], dtype=np.int32)
timestamps = np.array([10, 20, 30], dtype=np.int64)

# shape: [num_edges, feature_dim]
edge_features = np.array([
    [0.1, 1.0],
    [0.2, 0.5],
    [0.9, 0.3],
], dtype=np.float32)

walker.add_multiple_edges(sources, targets, timestamps, edge_features=edge_features)

walk_nodes, walk_timestamps, walk_lens, walk_edge_features = walker.get_random_walks_and_times(
    max_walk_len=4,
    walk_bias="Uniform",
    num_walks_total=5,
)

# walk_edge_features.shape == [num_walks, max_walk_len - 1, feature_dim]

walk_edge_features is None when no edge features are provided.
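Conceptually, the i-th feature row of a walk corresponds to the transition from node i to node i+1, which is why the middle dimension is max_walk_len - 1. A pure-Python sketch of that alignment (the (src, dst, ts)-keyed lookup below is only an illustration of the correspondence, not how the library stores features):

```python
# Features keyed by (src, dst, ts), mirroring the three edges above.
feature_of = {
    (0, 1, 10): [0.1, 1.0],
    (0, 2, 20): [0.2, 0.5],
    (1, 2, 30): [0.9, 0.3],
}

# A sampled walk of 3 nodes has 2 transitions, hence 2 feature rows.
walk_nodes = [0, 1, 2]
edge_times = [10, 30]  # timestamp of each traversed edge
walk_features = [
    feature_of[(walk_nodes[i], walk_nodes[i + 1], edge_times[i])]
    for i in range(len(walk_nodes) - 1)
]
print(walk_features)  # [[0.1, 1.0], [0.9, 0.3]]
```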

🏷️ Node Features

The library can also store dense node features. Use set_node_features(node_ids, node_features) to populate features for specific nodes, then get_node_features() to retrieve the dense matrix.
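A conceptual sketch of what a dense node-feature matrix means here: features set for specific node IDs, with unset rows left at a default (the zero-fill default and dimensions below are assumptions for illustration, not the library's documented behavior):

```python
num_nodes, feature_dim = 5, 3

# Dense matrix: one row per node, zero-filled by default (assumed).
dense = [[0.0] * feature_dim for _ in range(num_nodes)]

# Populate features for a subset of nodes, as set_node_features would.
node_ids = [1, 3]
node_features = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
for nid, feats in zip(node_ids, node_features):
    dense[nid] = list(feats)

print(dense[3])  # [4.0, 5.0, 6.0]
print(dense[0])  # [0.0, 0.0, 0.0]  (never set)
```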


🧭 Bias Selection Notes

  • Use ExponentialIndex or Linear for recency-aware sampling with no extra setup.
  • Use ExponentialWeight when you want CTDNE-style weight computation (enable_weight_computation=True, optionally tune timescale_bound).
  • Use SpatioTemporal for NeurTWs-style sampling that combines three dynamic biases during sampling: temporal recency, spatial preference for nodes with lower temporal degree, and an exploration penalty that discourages revisiting nodes.
  • Use TemporalNode2Vec when you need return/in-out control via temporal_node2vec_p and temporal_node2vec_q.
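To make index-based recency weighting concrete, here is a pure-Python sketch of exponential weights over a node's candidate edges ranked newest-first. The exp(-i) formula is a stand-in chosen for the example; the library's exact ExponentialIndex formula is an implementation detail:

```python
import math
import random

def exponential_index_weights(num_edges):
    """Weight the i-th most recent edge proportionally to exp(-i)."""
    raw = [math.exp(-i) for i in range(num_edges)]
    total = sum(raw)
    return [w / total for w in raw]

# Edges ranked newest-first; the most recent edge gets the highest probability.
probs = exponential_index_weights(4)
print(probs[0] > probs[1] > probs[2] > probs[3])  # True
print(abs(sum(probs) - 1.0) < 1e-12)              # True

# Sample one edge index according to the bias.
idx = random.choices(range(4), weights=probs, k=1)[0]
```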

📦 Dependencies

| Dependency | Purpose |
| --- | --- |
| pybind11 | Python-C++ bindings |
| python3 | Required for building the Python interfaces |
| gtest | Unit testing framework |

💡 Tip: Use vcpkg to easily install and link the C++ dependencies.


📦 Installation

pip install temporal-random-walk

📖 Documentation

📌 C++ Documentation →
📌 Python Interface Documentation →


📚 Inspired By

Nguyen, Giang Hoang, et al.
"Continuous-Time Dynamic Network Embeddings."
Companion Proceedings of The Web Conference 2018.

👨‍🔬 Built by Packets Research Lab

🚀 Contributions welcome! Open a PR or issue if you have suggestions.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

temporal_random_walk-1.6.2-cp311-cp311-manylinux_2_34_x86_64.whl (78.3 MB)
Uploaded: CPython 3.11, manylinux (glibc 2.34+), x86-64

temporal_random_walk-1.6.2-cp310-cp310-manylinux_2_34_x86_64.whl (78.3 MB)
Uploaded: CPython 3.10, manylinux (glibc 2.34+), x86-64

temporal_random_walk-1.6.2-cp39-cp39-manylinux_2_34_x86_64.whl (78.3 MB)
Uploaded: CPython 3.9, manylinux (glibc 2.34+), x86-64

temporal_random_walk-1.6.2-cp38-cp38-manylinux_2_34_x86_64.whl (78.3 MB)
Uploaded: CPython 3.8, manylinux (glibc 2.34+), x86-64

File details

Hashes for temporal_random_walk-1.6.2-cp311-cp311-manylinux_2_34_x86_64.whl:
SHA256: eb59e018e92cfc28d5fe1c8866838a854c325729afaae72926ebba67c6c18927
MD5: 1c3b2805893d2af46f22d5ebbf519470
BLAKE2b-256: 4c3ef3a89157e4a8011c1ab1f8b0042c40b78d6676ebda48916589ebc7e16ac7

Hashes for temporal_random_walk-1.6.2-cp310-cp310-manylinux_2_34_x86_64.whl:
SHA256: 0cc2c71dc704737fbc305f83edc4e7a00cfa164700cc173d1be8d82e4bd311d1
MD5: 798b60f644eef8f7c2ce2f25ff6e5689
BLAKE2b-256: c2125b3173070b165ab43457f405188072cf0a2e4d3bc569d3d055a52aa78ca0

Hashes for temporal_random_walk-1.6.2-cp39-cp39-manylinux_2_34_x86_64.whl:
SHA256: 7a05a713ee42d181b039391f8a66cd920a0af854f8f100235412c898ce45e7e7
MD5: 033134fbff9bc9bf1e0782aecaa04509
BLAKE2b-256: 4ded74c9aca0c8b9bebadf85e9752a9bd5052ab3b054bc08f04a4a34f6d51cee

Hashes for temporal_random_walk-1.6.2-cp38-cp38-manylinux_2_34_x86_64.whl:
SHA256: ca0b77669f5f955472841deb39117d64e4c49b1b1de72aac140c8bdb7e8151c7
MD5: 8d4b2dff3fecd323544bd3a1c9540d9a
BLAKE2b-256: 9b4ba6b1a3d0c5186c432d20fd9b55ace1b53ebce6b705385bb633ebcb5464b3

See more details on using hashes here.
