
Quantitative analysis for power markets


octoanalytics is a Python package from Octopus Energy that provides tools for quantitative analysis and risk calculation on energy data. It helps you analyze time-series energy consumption data, extract relevant features, and forecast future consumption with machine learning models.

Key Features

  • Time-based Feature Engineering: Extracts hourly, daily, and yearly features, and flags public holidays using a country calendar.
  • Forecasting Model: Utilizes XGBoost regression models to predict hourly energy consumption.
  • Model Evaluation: Computes MAPE (Mean Absolute Percentage Error) on the validation and test datasets.
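The time-based feature engineering described above can be sketched with plain pandas. This is an illustration of the kind of features extracted, not the package's internal code; the real implementation uses the `holidays` package for the holiday calendar, while here a hardcoded set stands in to keep the sketch self-contained:

```python
import pandas as pd

# Illustrative hourly timestamps over three days
dates = pd.date_range("2025-01-01", periods=72, freq="h")
df = pd.DataFrame({"date": dates})

# Time-based features of the kind octoanalytics extracts
df["hour"] = df["date"].dt.hour
df["day_of_week"] = df["date"].dt.dayofweek  # Monday=0 ... Sunday=6
df["month"] = df["date"].dt.month
df["year"] = df["date"].dt.year

# Binary holiday flag; a hardcoded set stands in for the `holidays`
# package's country calendar (New Year's Day here)
french_holidays = {pd.Timestamp("2025-01-01").date()}
df["is_holiday"] = df["date"].dt.date.isin(french_holidays).astype(int)

print(df.head())
```

With the real `holidays` package, the hardcoded set would be replaced by something like `holidays.country_holidays("FR")` and a membership check per date.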

Installation

To install octoanalytics, you can use pip:

pip install octoanalytics

Requirements

  • Python 3.7 or higher
  • pandas
  • numpy
  • xgboost
  • scikit-learn
  • holidays

These dependencies will be automatically installed when you install octoanalytics.

Usage

1. Importing the package

To use octoanalytics, import the eval_forecast function as shown below:

from octoanalytics import eval_forecast

2. Input Data Format

The input to eval_forecast must be a pandas DataFrame with the following columns:

  • 'date': A column containing date-time values in datetime format.
  • 'consumption': A column containing energy consumption values (the target variable).

Example of how the input data should look:

import pandas as pd

data = pd.DataFrame({
    'date': ['2025-01-01 00:00', '2025-01-01 01:00', '2025-01-01 02:00', ...],
    'consumption': [120.5, 115.3, 113.7, ...]
})

data['date'] = pd.to_datetime(data['date'])

3. Main Function: eval_forecast

The eval_forecast function trains a machine learning model to forecast energy consumption using XGBoost. Here's how to use it:

model, y_test_pred, y_test, test_mape, y_val_pred, val_mape = eval_forecast(data, country_code='FR')

Parameters

  • data (pd.DataFrame): A DataFrame containing the columns date and consumption.
  • country_code (str): The ISO code for the country to detect holidays (default is 'FR' for France).

Return Values

  • model: The trained XGBoost model.
  • y_test_pred: The model's predictions on the test set.
  • y_test: The actual values of the test set.
  • test_mape: The Mean Absolute Percentage Error (MAPE) of the model on the test set.
  • y_val_pred: The model's predictions on the validation set.
  • val_mape: The MAPE of the model on the validation set.

4. Example Usage

import pandas as pd
from octoanalytics import eval_forecast

# Example data (replace with your actual dataset)
data = pd.DataFrame({
    'date': ['2025-01-01 00:00', '2025-01-01 01:00', '2025-01-01 02:00'],
    'consumption': [120.5, 115.3, 113.7]
})
data['date'] = pd.to_datetime(data['date'])

# Run the forecast function
model, y_test_pred, y_test, test_mape, y_val_pred, val_mape = eval_forecast(data)

# Print the results
print(f"Validation MAPE: {val_mape:.2f}%")
print(f"Test MAPE: {test_mape:.2f}%")

Detailed Description of eval_forecast

The eval_forecast function is used to train a forecasting model for energy consumption using the XGBoost algorithm. Here's how it works:

  1. Data Preprocessing: The function extracts time-based features such as hour, day of the week, month, year, and week of the year. It also adds a binary feature indicating whether a given date is a holiday in the specified country.

  2. Data Splitting: The data is split into three sets:

    • Training set: 60% of the data.
    • Validation set: 20% of the data.
    • Test set: 20% of the data.
  3. Training the XGBoost Model: The model is trained on the training set, with early stopping based on validation data to prevent overfitting.

  4. Model Evaluation: The MAPE (Mean Absolute Percentage Error) is computed on both the validation and test sets.
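The 60/20/20 split in step 2 can be sketched as follows. A chronological split is assumed here (standard practice for time series, since shuffling would leak future observations into training); the exact split logic inside eval_forecast may differ:

```python
import numpy as np
import pandas as pd

# Illustrative hourly series of 100 points
n = 100
df = pd.DataFrame({
    "date": pd.date_range("2025-01-01", periods=n, freq="h"),
    "consumption": np.random.default_rng(0).uniform(100, 150, n),
})

# Chronological 60/20/20 split: oldest data trains, newest data tests
train_end = int(n * 0.6)
val_end = int(n * 0.8)
train = df.iloc[:train_end]
val = df.iloc[train_end:val_end]
test = df.iloc[val_end:]

print(len(train), len(val), len(test))  # 60 20 20
```

The validation slice is what early stopping (step 3) monitors to halt boosting before the model overfits the training data.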

XGBoost Model Parameters

  • n_estimators: The number of boosting rounds (default is 100).
  • learning_rate: The learning rate for adjusting tree weights (default is 0.1).
  • max_depth: The maximum depth of the decision trees (default is 5).

These parameters can be adjusted by modifying the call to the XGBRegressor model in the eval_forecast function.

Model Evaluation

The MAPE (Mean Absolute Percentage Error) is calculated on both the validation and test sets. Expressed as a percentage, it measures how far the predictions deviate from the actual values on average; a lower MAPE indicates better model performance.
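For reference, MAPE has a simple closed form. A minimal implementation (equivalent in spirit to scikit-learn's `mean_absolute_percentage_error`, scaled to a percentage):

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, expressed as a percentage."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

print(mape([100, 200], [110, 190]))  # 7.5
```

Note that MAPE is undefined when any actual value is zero, which is rarely an issue for aggregate consumption data but worth keeping in mind.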

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Contributions

Contributions are welcome! If you would like to suggest a feature or report a bug, please open an issue or submit a pull request.
