
urbanopt-ditto-reader

Enhancement of URBANopt™ GeoJSON that can be consumed by DiTTo reader
More detailed documentation is available on the URBANopt documentation page.

Installation prerequisites

  • Python >= 3.8
  • Python 3.10 is required if using this package via the URBANopt CLI.

Installation

pip install urbanopt-ditto-reader

Running the converter

You are expected to have an existing URBANopt project directory, with successful simulations of the electrical network components, before using this package.

Use the included Command Line Interface:

ditto_reader_cli -h

For help with the run-opendss command:

ditto_reader_cli run-opendss -h

Example command to run the ditto-reader:

ditto_reader_cli run-opendss -s <ScenarioFile> -f <FeatureFile>

Or build and use a config file (not necessary if using flags like the above example):

ditto_reader_cli run-opendss -c urbanopt_ditto_reader/example_config.json

If you are using your own config.json file, use the following fields:

  1. "urbanopt_scenario_file": Required. Path to the scenario CSV file.
  2. "urbanopt_geojson_file": Required. Path to the feature JSON file.
  3. "equipment_file": Optional. Path to a custom equipment file. If not specified, the 'extended_catalog.json' file is used.
  4. "opendss_folder": Required. Path to the directory created by this command to hold the OpenDSS output.
  5. "use_reopt": Required. Boolean (true/false) indicating whether to analyze REopt data, if it has been provided.
  6. "start_date": Optional. String indicating the start date of the simulation, in "YYYY/MM/DD" format.
  7. "start_time": Optional. String indicating the start time of the simulation, in "HH:MM:SS" format. start_date and start_time are concatenated into a timestamp (in "YYYY/MM/DD HH:MM:SS" format) that is cross-referenced against the timestamps in SCENARIO_NAME/opendss/profiles/timestamps.csv, which is built from the profiles in SCENARIO_NAME/FEATURE_ID/feature_reports/feature_report_reopt.csv if use_reopt is true, or SCENARIO_NAME/FEATURE_ID/feature_reports/default_feature_report.csv if use_reopt is false. If start_date is given without start_time, start_time defaults to 00:00:00. If the timestamp is not found, the entire year is run.
  8. "end_date": Optional. String indicating the end date of the simulation, in "YYYY/MM/DD" format.
  9. "end_time": Optional. String indicating the end time of the simulation, in "HH:MM:SS" format. end_date and end_time are concatenated and cross-referenced against timestamps.csv in the same way as the start timestamp. If end_date is given without end_time, end_time defaults to 23:00:00. If the timestamp is not found, the entire year is run.
  10. "timestep": Optional. Float, the number of minutes between simulation timesteps. An error is raised if it is smaller than (or not an even multiple of) the timestep in the REopt feature reports (if use_reopt is true) or the URBANopt feature reports (if use_reopt is false).
  11. "upgrade_transformers": Optional. Boolean (true/false). If true, transformers sized smaller than the sum of the peak loads they serve are automatically upgraded. This does not update the GeoJSON file, only the OpenDSS output files.
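Assembled from the fields above, a config file might look like the following. All paths, dates, and values here are placeholders for illustration, not defaults shipped with the package:

```json
{
  "urbanopt_scenario_file": "path/to/baseline_scenario.csv",
  "urbanopt_geojson_file": "path/to/example_project.json",
  "equipment_file": "path/to/custom_equipment.json",
  "opendss_folder": "path/to/opendss_output",
  "use_reopt": false,
  "start_date": "2017/01/01",
  "start_time": "00:00:00",
  "end_date": "2017/01/07",
  "end_time": "23:00:00",
  "timestep": 60,
  "upgrade_transformers": false
}
```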

If either start_time or end_time is invalid or set to None, the simulation is run for all timepoints provided by the REopt simulation (if use_reopt is true) or the URBANopt simulation (if use_reopt is false).
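The timestamp lookup described above can be sketched as follows. This is an illustrative re-implementation of the documented behavior, not the package's actual code; resolve_time_window and its arguments are hypothetical names:

```python
def resolve_time_window(timestamps, start_date=None, start_time=None,
                        end_date=None, end_time=None):
    """Resolve start/end indexes into a list of "YYYY/MM/DD HH:MM:SS"
    strings, e.g. read from SCENARIO_NAME/opendss/profiles/timestamps.csv.

    Missing times fall back to day boundaries (00:00:00 / 23:00:00),
    and an unmatched timestamp means the entire year is run.
    """
    def lookup(date, time, default_time):
        # Concatenate date and time into the cross-referenced timestamp.
        if date is None:
            return None
        stamp = f"{date} {time or default_time}"
        return timestamps.index(stamp) if stamp in timestamps else None

    start = lookup(start_date, start_time, "00:00:00")
    end = lookup(end_date, end_time, "23:00:00")
    if start is None or end is None:
        # Timestamp not found: simulate all available timepoints.
        return 0, len(timestamps) - 1
    return start, end
```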

Developer installation

  • Clone the repository: git clone https://github.com/urbanopt/urbanopt-ditto-reader.git
  • Change directories into the repository: cd urbanopt-ditto-reader
  • As general guidance, we recommend using virtual environments to avoid dependencies colliding between your Python projects. venv is the Python-native solution that will work everywhere, though other options may be more user-friendly.
  • Update pip and setuptools: pip install -U pip setuptools
  • Install the repository with developer dependencies: pip install -e .[dev]
  • Activate pre-commit (only once, after making a new venv): pre-commit install
    • It then runs automatically on your staged changes before every commit.
  • To check the whole repo, run pre-commit run --all-files
    • Settings and documentation links for pre-commit and ruff are in .pre-commit-config.yaml and pyproject.toml

Releasing

Increment the version in pyproject.toml. Use semantic versioning. When a new release is made in GitHub, a workflow automatically publishes to PyPI.
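For reference, under the standard PEP 621 layout the version lives in the [project] table of pyproject.toml; this fragment is illustrative and the actual file may contain additional fields:

```toml
[project]
name = "urbanopt-ditto-reader"
# Bump according to semantic versioning: MAJOR.MINOR.PATCH
version = "0.6.2"
```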

Download files

Download the file for your platform.

Source Distribution

urbanopt-ditto-reader-0.6.2.tar.gz (42.3 kB)


Built Distribution

urbanopt_ditto_reader-0.6.2-py3-none-any.whl (43.4 kB)


File details

Details for the file urbanopt-ditto-reader-0.6.2.tar.gz.

File metadata

  • Download URL: urbanopt-ditto-reader-0.6.2.tar.gz
  • Size: 42.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for urbanopt-ditto-reader-0.6.2.tar.gz:

  • SHA256: cc69d02135d38a9835a4105243fecbc71fe63dcf07210e6be97c36ad16bf9db3
  • MD5: d4369e2a7a727144fcb41f00f4c15286
  • BLAKE2b-256: 25c6b264a047c8a98346a2abae3c939841efd0899fda9cfc2f559bd509419c3a


File details

Details for the file urbanopt_ditto_reader-0.6.2-py3-none-any.whl.

File hashes

Hashes for urbanopt_ditto_reader-0.6.2-py3-none-any.whl:

  • SHA256: 4543b49e4e6f856ccd74f40ec87226715c4366a2b09c6b40c7e04772aab775c6
  • MD5: 0575db45b944f569cbe40fa74425608b
  • BLAKE2b-256: 588dfb73c5cdbf0d5805f427cd3bde03db48203bf41274d441bdb59a0a525225

