
Project description

FluvialGen

A Python package for generating synthetic river networks and datasets.

Installation

You can install FluvialGen using pip:

pip install fluvialgen

Or install from source:

git clone https://github.com/joseenriqueruiznavarro/FluvialGen.git
cd FluvialGen
pip install -e .

Requirements

  • Python >= 3.8
  • NumPy
  • Pandas
  • SciPy
  • Matplotlib
  • GeoPandas
  • Shapely
  • Rasterio
  • tqdm

Integration with River Models

MovingWindowBatcher

This class iterates over a dataset as overlapping windows of instances, grouped into fixed-size batches:

from river import compose, linear_model, preprocessing, optim, metrics
from fluvialgen.movingwindow_generator import MovingWindowBatcher
from river import datasets

# Create a River pipeline
model = compose.Select('clouds', 'humidity', 'pressure', 'temperature', 'wind')
model |= preprocessing.StandardScaler()
model |= linear_model.LinearRegression(optimizer=optim.SGD(0.001))

# Initialize metrics
metric = metrics.MAE()

# Create the dataset and batcher
dataset = datasets.Bikes()
batcher = MovingWindowBatcher(
    dataset=dataset,
    instance_size=2,  # Size of each window
    batch_size=2,     # Number of instances per batch
    n_instances=1000
)

# Train and evaluate the model
try:
    # Process batches with progressive validation: predict on each
    # instance before learning from it, so the metric reflects unseen data
    for X, y in batcher:
        for i in range(len(X)):
            x = X.iloc[i].to_dict()  # River models expect dict inputs
            target = y.iloc[i]
            y_pred = model.predict_one(x)
            metric.update(target, y_pred)
            model.learn_one(x, target)

    print(f"Final MAE: {metric}")

finally:
    # Clean up
    batcher.stop()

PastForecastBatcher

This class pairs a window of past instances with a single target value located a fixed number of steps ahead (the forecast horizon):

from river import compose, linear_model, preprocessing, optim, metrics
from fluvialgen.past_forecast_batcher import PastForecastBatcher
from river import datasets

# Create a River pipeline
model = compose.Select('clouds', 'humidity', 'pressure', 'temperature', 'wind')
model |= preprocessing.StandardScaler()
model |= linear_model.LinearRegression(optimizer=optim.SGD(0.001))

# Initialize metrics
metric = metrics.MAE()

# Create the dataset and batcher
dataset = datasets.Bikes()
batcher = PastForecastBatcher(
    dataset=dataset,
    past_size=3,      # Number of past instances to include
    forecast_size=1,  # Use data 1 position ahead of past window
    n_instances=1000
)

# Evaluate the model
try:
    # Process instances: each step yields a window of past features
    # together with the single forecast target
    for X_past, y_forecast in batcher:
        # Note: y_forecast is a single value, not a Series; to train the
        # model online you would need past target values from your own source
        forecast_features = X_past.iloc[-1].to_dict()  # last feature vector
        y_pred = model.predict_one(forecast_features)
        metric.update(y_forecast, y_pred)

    print(f"Final MAE: {metric}")

finally:
    # Clean up
    batcher.stop()

Data Structure

MovingWindowBatcher

For each batch, MovingWindowBatcher returns:

  • X: DataFrame with all instances in the batch
  • y: Series with all targets in the batch

For example, with instance_size=2 and batch_size=2:

  • First batch:
    • X = DataFrame with [x1,x2,x2,x3]
    • y = Series with [y1,y2,y2,y3]
  • Second batch:
    • X = DataFrame with [x2,x3,x3,x4]
    • y = Series with [y2,y3,y3,y4]
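The window-and-flatten layout above can be sketched in plain Python. This is only an illustration of the documented batch layout, not FluvialGen's actual implementation; the helper name `moving_window_batches` is hypothetical and it uses plain lists rather than DataFrames:

```python
def moving_window_batches(xs, ys, instance_size, batch_size):
    """Illustrate MovingWindowBatcher's layout: overlapping windows of
    `instance_size` items, with `batch_size` consecutive windows
    flattened into each batch."""
    # Every window of `instance_size` consecutive items
    windows = [
        (xs[i:i + instance_size], ys[i:i + instance_size])
        for i in range(len(xs) - instance_size + 1)
    ]
    # Each batch flattens `batch_size` consecutive windows,
    # sliding forward by one window at a time
    batches = []
    for i in range(len(windows) - batch_size + 1):
        group = windows[i:i + batch_size]
        X = [x for window, _ in group for x in window]
        y = [t for _, window in group for t in window]
        batches.append((X, y))
    return batches

batches = moving_window_batches(
    ["x1", "x2", "x3", "x4"], ["y1", "y2", "y3", "y4"],
    instance_size=2, batch_size=2,
)
# batches[0] == (["x1", "x2", "x2", "x3"], ["y1", "y2", "y2", "y3"])
# batches[1] == (["x2", "x3", "x3", "x4"], ["y2", "y3", "y3", "y4"])
```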

PastForecastBatcher

For each instance, PastForecastBatcher returns:

  • X_past: DataFrame with past feature data
  • y_forecast: Single value representing the target at the forecast position

For example, with past_size=3 and forecast_size=0:

  • First instance:
    • X_past = DataFrame with [x1,x2,x3]
    • y_forecast = y4 (the target located past_size + forecast_size positions after the start of the window)
  • Second instance:
    • X_past = DataFrame with [x2,x3,x4]
    • y_forecast = y5

With past_size=3 and forecast_size=1:

  • First instance:
    • X_past = DataFrame with [x1,x2,x3]
    • y_forecast = y5 (one extra step ahead, since forecast_size=1)
  • Second instance:
    • X_past = DataFrame with [x2,x3,x4]
    • y_forecast = y6
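The indexing in both examples can be sketched as follows. This is a minimal stand-in for illustration only; the helper name `past_forecast_pairs` is hypothetical and it uses plain lists rather than DataFrames:

```python
def past_forecast_pairs(xs, ys, past_size, forecast_size):
    """Pair each window of `past_size` features with the target located
    `past_size + forecast_size` positions after the window start."""
    pairs = []
    for start in range(len(xs)):
        target_idx = start + past_size + forecast_size
        if target_idx >= len(ys):
            break  # no target left at the forecast position
        pairs.append((xs[start:start + past_size], ys[target_idx]))
    return pairs

pairs = past_forecast_pairs(
    ["x1", "x2", "x3", "x4", "x5", "x6"],
    ["y1", "y2", "y3", "y4", "y5", "y6"],
    past_size=3, forecast_size=1,
)
# pairs[0] == (["x1", "x2", "x3"], "y5")
# pairs[1] == (["x2", "x3", "x4"], "y6")
```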

Download files

Download the file for your platform.

Source Distribution

fluvialgen-1.0.0.tar.gz (11.6 kB), uploaded as Source

Built Distribution


fluvialgen-1.0.0-py3-none-any.whl (16.1 kB), uploaded for Python 3

File details

Details for the file fluvialgen-1.0.0.tar.gz.

File metadata

  • Download URL: fluvialgen-1.0.0.tar.gz
  • Upload date:
  • Size: 11.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for fluvialgen-1.0.0.tar.gz:

  • SHA256: 6c9aecf4052beab8f1ac3fdb5508ac26e3502a4679b14f581ce9021fbca6a677
  • MD5: 17d3538473c4edc7f5ed8af7cbbcba41
  • BLAKE2b-256: 3ba6e16931618e8d0125fc62311c21f19b172514fc0342da8f72beceb0fe7600

See the PyPI help pages for more details on using hashes.
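To check a downloaded file against the published digests above before installing, a short sketch using only the Python standard library may help; the function name `sha256_of_file` is ours, not part of FluvialGen:

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published SHA256 digest, e.g.:
# sha256_of_file("fluvialgen-1.0.0.tar.gz") should equal
# "6c9aecf4052beab8f1ac3fdb5508ac26e3502a4679b14f581ce9021fbca6a677"
```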

File details

Details for the file fluvialgen-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: fluvialgen-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 16.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for fluvialgen-1.0.0-py3-none-any.whl:

  • SHA256: 65478d1022b51f345b758d375b419cb31f6298b13be59df11e8d66c38de0783e
  • MD5: f7ae29ffc3b87c108542a815153c3c55
  • BLAKE2b-256: d6f8e6db7503b431c4edcb6387f74a2169312e5014f52717fa94bb17c79c1849

