
A complete package for data ingestion into the Switch Automation platform.

Project description

Switch Automation library for Python

This is a package for data ingestion into the Switch Automation software platform.

You can find out more about the platform on the Switch Automation website.

Getting started

Prerequisites

Install the package

Install the Switch Automation library for Python with pip:

pip install switch_api
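With the package installed, everything else hangs off the api_inputs object returned by switch_api.initialise() (see the 0.2.0 notes below). A minimal sketch; the deferred import is only so the snippet loads without platform access, and initialise() may accept arguments not shown here:

```python
def connect():
    # Deferred import so this sketch loads even without the package installed.
    import switch_api as sw
    # initialise() opens a browser window for Switch Platform SSO and
    # returns the api_inputs object that other switch_api calls require.
    return sw.initialise()

# api_inputs = connect()  # opens the platform login screen in a browser
```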

History

0.2.7

Added

  • New method, deploy_as_on_demand_data_feed() added to the Automation class of the pipeline module
    • this new method is only applicable for tasks that subclass the EventWorkOrderTask base class.

Changed

  • The data_feed_id is now a required parameter, not optional, for the following methods on the Automation class of the pipeline module:
    • deploy_on_timer()
    • deploy_as_email_data_feed()
    • deploy_as_ftp_data_feed()
    • deploy_as_upload_data_feed()
  • The email_address_domain is now a required parameter, not optional, for the deploy_as_email_data_feed() method on the Automation class of the pipeline module.
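A sketch of updating an existing timer deployment under the new signature; the api_inputs and task positional arguments are assumptions based on the rest of this changelog, and deploy_on_timer() takes further arguments not shown:

```python
def redeploy_timer(api_inputs, task, data_feed_id):
    # Deferred import keeps the sketch importable without the platform.
    import switch_api as sw
    automation = sw.pipeline.Automation
    # data_feed_id is now required (0.2.7); passing an existing id updates
    # that deployment instead of creating a new data feed instance.
    return automation.deploy_on_timer(api_inputs, task, data_feed_id=data_feed_id)
```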

Fixed

  • Fixed an issue with the payload on the switch_api.pipeline.Automation.register_task() method for the AnalyticsTask and EventWorkOrderTask base classes.

0.2.6

Fixed

  • Fixed issues with 2 methods in the Automation class of the pipeline module:
    • delete_data_feed()
    • cancel_deployed_data_feed()

Added

In the pipeline module:

  • Added new class EventWorkOrderTask
    • This task type is for generation of work orders in 3rd party systems via the Switch Automation platform's Events UI.

Changed

In the pipeline module:

  • AnalyticsTask - added a new method & a new abstract property:
    • analytics_settings_definition abstract property - defines the required inputs (& how these are displayed in the Switch Automation platform UI) for the task to successfully run
    • added check_analytics_settings_valid() method that should be used to validate the analytics_settings dictionary passed to the start() method contains the required keys for the task to successfully run (as defined by the analytics_settings_definition)
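The interplay between the two new members can be illustrated with a plain-Python stand-in; the definition schema below is an invented placeholder, not the library's actual analytics_settings_definition format:

```python
# Illustrative stand-in: the shape of these definition entries is an
# assumption, not the library's real schema.
ANALYTICS_SETTINGS_DEFINITION = [
    {"key": "threshold", "display_label": "Alert threshold"},
    {"key": "lookback_days", "display_label": "Lookback period (days)"},
]

def settings_valid(analytics_settings: dict) -> bool:
    # Mimics the idea behind check_analytics_settings_valid(): every key
    # declared in the definition must be present in the dict passed to start().
    required = {entry["key"] for entry in ANALYTICS_SETTINGS_DEFINITION}
    return required.issubset(analytics_settings.keys())

print(settings_valid({"threshold": 3, "lookback_days": 7}))  # True
print(settings_valid({"threshold": 3}))                      # False
```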

In the error_handlers module:

  • In the post_errors() function, the parameter errors_df is renamed to errors and now accepts strings in addition to pandas.DataFrame

Removed

Due to cutover to a new backend, the following have been removed:

  • run_clone_modules() function from the analytics module
  • the entire platform_insights module, including the:
    • get_current_insights_by_equipment() function

0.2.5

Added

  • The Automation class of the pipeline module has 2 new methods:
    • delete_data_feed()
      • Used to delete an existing data feed and all related deployment settings
    • cancel_deployed_data_feed()
      • Used to cancel the specified deployment_type for a given data_feed_id
      • Replaces and expands the functionality previously provided by the cancel_deployed_timer() method, which has been removed.

Removed

  • Removed the cancel_deployed_timer() method from the Automation class of the pipeline module
    • this functionality is available through the new cancel_deployed_data_feed() method when the deployment_type parameter is set to ['Timer']
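A sketch of the replacement call; the exact signature of cancel_deployed_data_feed() beyond the data_feed_id and deployment_type parameters named in these notes is an assumption:

```python
def cancel_timer(api_inputs, data_feed_id):
    # Deferred import so the sketch loads without the package installed.
    import switch_api as sw
    # Replaces the removed cancel_deployed_timer(); the positional
    # api_inputs argument is an assumption based on the other methods.
    return sw.pipeline.Automation.cancel_deployed_data_feed(
        api_inputs, data_feed_id, deployment_type=['Timer'])
```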

0.2.4

Changed

  • New parameter data_feed_name added to the 4 deployment methods in the pipeline module's Automation class
    • deploy_as_email_data_feed()
    • deploy_as_ftp_data_feed()
    • deploy_as_upload_data_feed()
    • deploy_on_timer()

0.2.3

Fixed

  • Resolved minor issue on register_task() method for the Automation class in the pipeline module.

0.2.2

Fixed

  • Resolved minor issue on upsert_discovered_records() function in integration module related to device-level and sensor-level tags.

0.2.1

Added

  • New class added to the pipeline module
    • DiscoverableIntegrationTask - for API integrations that are discoverable.
      • Requires the process() & run_discovery() abstract methods to be implemented when subclassing
      • Requires an additional abstract property, integration_device_type_definition, compared to the base Task class
  • New function upsert_discovered_records() added to the integration module
    • Required for the DiscoverableIntegrationTask.run_discovery() method to upsert discovery records to the Build - Discovery & Selection UI
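A skeleton of a discoverable task, assuming the abstract members described above; the method signatures and placeholder bodies are illustrative, not the library's actual contract:

```python
def build_discovery_task():
    # Defining the subclass inside a function defers the switch_api import,
    # so this sketch loads without the package installed.
    import switch_api as sw

    class ExampleDiscovery(sw.pipeline.DiscoverableIntegrationTask):
        # Required abstract property beyond the base Task (per 0.2.1 notes);
        # the returned value here is a placeholder, not a real definition.
        @property
        def integration_device_type_definition(self):
            return {}

        def process(self, api_inputs):
            ...  # ingest records for this integration

        def run_discovery(self, api_inputs):
            # Discovery results are pushed to the Build - Discovery &
            # Selection UI via upsert_discovered_records().
            ...

    return ExampleDiscovery
```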

Fixed

  • Set minimum msal version required for the switch_api package to be installed.

0.2.0

Major overhaul done of the switch_api package. A complete replacement of the API used by the package was done.

Changed

  • The user_id parameter has been removed from the switch_api.initialise() function.
    • Authentication of the user is now done via Switch Platform SSO. The call to initialise will trigger a web browser window to open to the platform login screen.
      • Note: each call to initialise() for a portfolio in a different datacentre will open a browser window and require the user to input their username & password.
      • For initialise() on a different portfolio within the same datacentre, the authentication is cached, so the user will not be asked to log in again.
  • api_inputs is now a required parameter for the switch_api.pipeline.Automation.register_task()
  • The deploy_on_timer(), deploy_as_email_data_feed(), deploy_as_upload_data_feed(), and deploy_as_ftp_data_feed() methods on the switch_api.pipeline.Automation class have an added parameter: data_feed_id
    • This new parameter allows the user to update an existing deployment for the portfolio specified in the api_inputs.
    • If data_feed_id is not supplied, a new data feed instance will be created (even if portfolio already has that task deployed to it)

0.1.18

Changed

  • removed rebuild of the ObjectProperties table in ADX on call to upsert_device_sensors()
  • removed rebuild of the Installation table in ADX on call to upsert_sites()

0.1.17

Fixed

  • Fixed issue with deploy_on_timer() method of the Automation class in the pipeline module.
  • Fixed column header issue with the get_tag_groups() function of the integration module.
  • Fixed missing Meta column on table generated via upsert_workorders() function of the integration module.

Added

  • New method for uploading custom data to blob storage: Blob.custom_upload()

Updated

  • Updated the upsert_device_sensors() to improve performance and aid release of future functionality.

0.1.16

Added

To the pipeline module:

  • New method data_feed_history_process_errors(), added to the Automation class.
    • This method returns a dataframe containing the distinct set of error types encountered for a specific data_feed_file_status_id
  • New method data_feed_history_errors_by_type(), added to the Automation class.
    • This method returns a dataframe containing the actual errors identified for the specified error_type and data_feed_file_status_id
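The two methods chain naturally: list the distinct error types first, then fetch the rows for each. The column name 'error_type' iterated below is an assumption:

```python
def drill_into_errors(api_inputs, data_feed_file_status_id):
    # Deferred import; both methods return pandas DataFrames per these notes.
    import switch_api as sw
    automation = sw.pipeline.Automation
    error_types = automation.data_feed_history_process_errors(
        api_inputs, data_feed_file_status_id)
    # For each distinct error type, pull the actual error rows.
    return {
        etype: automation.data_feed_history_errors_by_type(
            api_inputs, data_feed_file_status_id, error_type=etype)
        for etype in error_types.get('error_type', [])
    }
```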

Additional logging was also incorporated in the backend to support the Switch Platform UI.

Fixed

  • Fixed issue with register() method of the Automation class in the pipeline module.

Changed

For the pipeline module:

  • Standardised methods of the Automation class to return pandas.DataFrame objects.
  • Added additional error checks to ensure only allowed values are passed to the various Automation class methods for the parameters:
    • expected_delivery
    • deploy_type
    • queue_name
    • error_type

For the integration module:

  • Added additional error checks to ensure only allowed values are passed to post_errors function for the parameters:
    • error_type
    • process_status

For the dataset module:

  • Added additional error check to ensure only allowed values are provided for the query_language parameter of the get_data function.

For the _platform module:

  • Added additional error checks to ensure only allowed values are provided for the account parameter.

0.1.14

Changed

  • Updated get_device_sensors() to no longer auto-detect data types, to prevent issues such as stripping leading zeroes from metadata values.

0.1.13

Added

To the pipeline module:

  • Added a new method, data_feed_history_process_output, to the Automation class

0.1.11

Changed

  • Updated access to the logger - now available as switch_api.pipeline.logger()
  • Update to function documentation

0.1.10

Changed

  • Updated the calculation of min/max date (for timezone conversions) inside the upsert_device_sensors function as the previous calculation method will not be supported in a future release of numpy.

Fixed

  • Fixed issue with retrieval of tag groups and tags via the functions:
    • get_sites
    • get_device_sensors

0.1.9

Added

  • New module platform_insights

In the integration module:

  • New function get_sites added to lookup site information (optionally with site-level tags)
  • New function get_device_sensors added to assist with lookup of device/sensor information, optionally including either metadata or tags.
  • New function get_tag_groups added to lookup list of sensor-level tag groups
  • New function get_metadata_keys added to lookup list of device-level metadata keys

Changed

  • Modifications to connections to storage accounts.
  • Additional parameter queue_name added to the following methods of the Automation class of the pipeline module:
    • deploy_on_timer
    • deploy_as_email_data_feed
    • deploy_as_upload_data_feed
    • deploy_as_ftp_data_feed

Fixed

In the pipeline module:

  • Addressed issue with the schema validation for the upsert_workorders function

0.1.8

Changed

In the integrations module:

  • Updated to batch upserts by DeviceCode to improve reliability & performance of the upsert_device_sensors function.

Fixed

In the analytics module:

  • Fixed a typing issue that caused an error when importing the switch_api package on Python 3.8.

0.1.7

Added

In the integrations module:

  • Added new function upsert_workorders
    • Provides ability to ingest work order data into the Switch Automation platform.
    • Documentation provides details on required & optional fields in the input dataframe and also provides information on allowed values for some fields.
    • Two attributes are available on the function, added to assist with creation of scripts by providing the lists of required & optional fields:
      • upsert_workorders.df_required_columns
      • upsert_workorders.df_optional_columns
  • Added new function get_states_by_country:
    • Retrieves the list of states for a given country. Returns a dataframe containing both the state name and abbreviation.
  • Added new function get_equipment_classes:
    • Retrieves the list of allowed values for Equipment Class.
      • EquipmentClass is a required field for the upsert_device_sensors function
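The helper attributes above make it easy to start from an empty, correctly-shaped dataframe; whether they return lists or another sequence type is an assumption (hence the list() calls):

```python
def workorder_frame():
    # Deferred imports so the sketch loads without the platform installed.
    import pandas as pd
    import switch_api as sw
    upsert = sw.integration.upsert_workorders
    # The function attributes list the expected input columns (0.1.7 notes).
    cols = list(upsert.df_required_columns) + list(upsert.df_optional_columns)
    return pd.DataFrame(columns=cols)
```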

Changed

In the integrations module:

  • For the upsert_device_sensors function:
    • New attributes added to assist with creation of tasks:
      • upsert_device_sensors.df_required_columns - returns list of required columns for the input df
    • Two new fields required to be present in the dataframe passed to function by parameter df:
      • EquipmentClass
      • EquipmentLabel
    • Fixed the documentation so that the required fields listed there match the function's behaviour.
  • For the upsert_sites function:
    • New attributes added to assist with creation of tasks:
      • upsert_sites.df_required_columns - returns list of required columns for the input df
      • upsert_sites.df_optional_columns - returns list of optional columns for the input df
  • For the get_templates function:
    • Added functionality to filter by type via new parameter object_property_type
    • Fixed capitalisation issue where first character of column names in dataframe returned by the function had been converted to lowercase.
  • For the get_units_of_measure function:
    • Added functionality to filter by type via new parameter object_property_type
    • Fixed capitalisation issue where first character of column names in dataframe returned by the function had been converted to lowercase.

In the analytics module:

  • Modifications to type hints and documentation for the functions:
    • get_clone_modules_list
    • run_clone_modules
  • Additional logging added to run_clone_modules

0.1.6

Added

  • Added new function upsert_timeseries_ds() to the integrations module

Changed

  • Additional logging added to invalid_file_format() function from the error_handlers module.

Removed

  • Removed append_timeseries() function

0.1.5

Fixed

  • Fixed a bug in the upsert_sites() function that caused optional columns to be treated as required columns.

Added

Added additional functions to the error_handlers module:

  • validate_datetime() - which checks whether the values of the datetime column(s) of the source file are valid. Any datetime errors identified by this function should be passed to the post_errors() function.
  • post_errors() - used to post errors (apart from those identified by the invalid_file_format() function) to the data feed dashboard.
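A sketch of the intended flow; only the function names and the errors parameter come from these notes, everything else (argument names, the truthiness check) is an assumption:

```python
def check_and_report(api_inputs, df, datetime_columns):
    # Deferred import; parameter names beyond the changelog are assumptions.
    import switch_api as sw
    errors = sw.error_handlers.validate_datetime(df, datetime_columns)
    if len(errors) > 0:
        # post_errors() accepts a DataFrame or, since 0.2.6, a string; its
        # error_type/process_status arguments take a restricted set of
        # allowed values (not listed in this changelog).
        sw.error_handlers.post_errors(api_inputs, errors=errors)
    return errors
```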

0.1.4

Changed

Added additional required properties to the Abstract Base Classes (ABCs): Task, IntegrationTask, AnalyticsTask, LogicModuleTask. These properties are:

  • Author
  • Version

Added additional parameter query_language to the switch.integration.get_data() function. Allowed values for this parameter are:

  • sql
  • kql
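Since the allowed values are fixed, a caller can mirror the check locally before hitting the API; the positional arguments to get_data() below are assumptions (and note that later releases expose get_data via the dataset module):

```python
ALLOWED_QUERY_LANGUAGES = {"sql", "kql"}

def fetch(api_inputs, query, query_language="sql"):
    # Mirror the library's allowed-value check locally before calling out.
    if query_language not in ALLOWED_QUERY_LANGUAGES:
        raise ValueError(
            f"query_language must be one of {sorted(ALLOWED_QUERY_LANGUAGES)}")
    import switch_api as sw  # deferred so the sketch loads stand-alone
    return sw.integration.get_data(api_inputs, query,
                                   query_language=query_language)
```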

Removed the name_as_filename and treat_as_timeseries parameters from the following functions:

  • switch.integration.replace_data()
  • switch.integration.append_data()
  • switch.integration.upload_data()
