Switch Automation library for Python
This is a package for data ingestion into the Switch Automation software platform.
You can find out more about the platform on the Switch Automation website.
Getting started
Prerequisites
- Python 3.8 or later is required to use this package.
- You must have a Switch Automation user account to use this package.
Install the package
Install the Switch Automation library for Python with pip:
pip install switch_api
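Once installed, a typical session starts by initialising the package against a portfolio. A minimal sketch (the portfolio id below is a placeholder):
import switch_api as sw

# Authenticates via Switch Platform SSO (a browser window opens to the platform login screen)
# and scopes subsequent calls to the given portfolio. Replace the placeholder with your own id.
api_inputs = sw.initialize(api_project_id='<portfolio-id>')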
History
0.4.8
Added
- New class added to the pipeline module: BlobTask
  - This class is used to create integrations that post data to the Switch Automation Platform using a blob container & Event Hub Queue as the source.
  - Please note: this task type requires external setup in Azure by Switch Automation developers before a task can be registered or deployed.
  - Requires the process_file() abstract method to be created when sub-classing (see the hedged sketch below).
- New method, deploy_as_on_demand_data_feed(), added to the Automation class of the pipeline module.
  - This new method is only applicable for tasks that subclass the BlobTask base class.
- In the integration module, new helper methods have been added:
  - connect_to_sql() creates a pyodbc connection object to enable easier querying of the SQL database via the pyodbc library.
  - amortise_across_days() enables easier amortisation of data across days in a period, either inclusive or exclusive of the end date.
  - get_metadata_where_clause() enables creation of the sql_where_clause for the get_device_sensors() method, where the SQL checks that each metadata key is not null.
- In the error_handlers module:
  - check_duplicates() method added to check for duplicates & post appropriate errors to the Task Insights UI in the Switch Automation platform.
- In the _utils._utils module:
  - requests_retry_session2 helper function added to enable automatic retries of API calls.
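A hypothetical sketch of sub-classing BlobTask is below. The exact abstract members of BlobTask are not documented in this changelog, so the process_file() signature and the property set shown are assumptions modelled on the other task classes in this package:
import uuid
import switch_api as sw

class MyBlobIntegration(sw.pipeline.BlobTask):
    # The id/description/author/version properties mirror the other Task base classes;
    # whether BlobTask requires exactly this set is an assumption.
    @property
    def id(self) -> uuid.UUID:
        return uuid.UUID('00000000-0000-0000-0000-000000000000')  # placeholder - generate your own

    @property
    def description(self) -> str:
        return 'Posts data arriving in the configured blob container to the platform.'

    @property
    def author(self) -> str:
        return 'Your Name'

    @property
    def version(self) -> str:
        return '0.1.0'

    def process_file(self, api_inputs, file_path_to_process):
        # Parameter names here are assumptions - consult the BlobTask documentation
        # for the actual process_file() signature.
        pass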
Updated
- In the integration module:
  - New parameter include_removed_sites added to the get_sites() function.
    - Determines whether or not to include sites marked as "IsRemoved" in the returned dataframe.
    - Defaults to False, indicating removed sites will not be included.
  - Updated the get_device_sensors() method to check whether requested metadata keys or requested tag groups exist for the portfolio, and raise an exception if they don't.
  - New parameter send_notification added to the upsert_timeseries() function.
    - Enables Iq Notification messages to be sent when set to True.
    - Defaults to False.
  - For the get_sites(), get_device_sensors() and get_data() functions, additional parameters have been added to allow customisation of the newly implemented retry logic (see the hedged example below):
    - retries: int - number of retries performed before returning the last retry instance's response status. Max retries = 10. Currently defaults to 0 for backwards compatibility.
    - backoff_factor - a backoff factor to apply between attempts after the second try (most errors are resolved immediately by a second try without a delay). Sleep time between attempts is {backoff factor} * (2 ** ({retry count} - 1)) seconds.
- In the error_handlers module:
  - For the validate_datetime() function, added two new parameters to enable automatic posting of errors to the Switch Platform:
    - errors: boolean, defaults to False. To enable posting of errors, set to True.
    - api_inputs: defaults to None. Needs to be set to the object returned from switch_api.initialize() if errors=True.
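A hedged example of the new read-side parameters (keyword names come from the bullets above; the api_inputs pattern follows the rest of this document, and the df construction for upsert_timeseries is omitted):
import switch_api as sw

api_inputs = sw.initialize(api_project_id='<portfolio-id>')

# Retry logic on read functions: up to 3 retries with exponential backoff.
sites_df = sw.integration.get_sites(
    api_inputs=api_inputs,
    include_removed_sites=False,  # removed sites excluded by default
    retries=3,
    backoff_factor=0.5,           # sleeps 0.5 * (2 ** (retry - 1)) seconds between attempts
)

# Iq Notification messages on timeseries upserts:
# sw.integration.upsert_timeseries(api_inputs=api_inputs, df=df, send_notification=True)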
Fixed
- In the integration module:
  - Resolved an outlier scenario resulting in an unhandled exception in the upsert_sites() function.
  - Minor fix to the upsert_discovered_records() method to handle the case when unexpected columns are present in the dataframe passed to the df input parameter.
0.4.6
Added
- Task Priority and Task Framework data feed deployment settings
  - Task Priority and Task Framework are now available to set when deploying data feeds (see the hedged sketch below).
  - Task Priority
    - Determines the priority of the data feed tasks when processing.
    - This equates to how much resource is allotted to run the task.
    - Available options are: default, standard, or advanced.
      - Set to advanced for higher resource when processing the data feed task.
    - Defaults to 'default'.
  - Task Framework
    - Determines the framework of the data feed tasks when processing.
      - 'PythonScriptFramework' for the old task runner engine.
      - 'TaskInsightsEngine' for the new task running in container apps.
    - Defaults to 'PythonScriptFramework'.
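A hypothetical sketch of applying these settings when deploying a data feed. The task_priority and task_framework keyword names are assumptions inferred from the setting names above, so the call is left commented out; verify the names against the Automation deployment method signatures:
import switch_api as sw

api_inputs = sw.initialize(api_project_id='<portfolio-id>')
# task = ...  # a previously registered task instance

# Assumed keyword names - consult the deployment method documentation before use.
# sw.pipeline.Automation.deploy_on_timer(
#     api_inputs=api_inputs,
#     task=task,
#     data_feed_name='my-data-feed',
#     expected_delivery='5min',
#     task_priority='advanced',             # default | standard | advanced
#     task_framework='TaskInsightsEngine',  # or 'PythonScriptFramework' (default)
# )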
0.4.5
Added
- Email Sender Module
  - Send emails to active users within a Portfolio in the Switch Automation Platform.
  - Limitations:
    - Emails cannot be sent to users outside of the Portfolio, including other users within the platform.
    - Maximum of five attachments per email.
    - Each attachment has a maximum size of 5 MB.
  - See function code documentation and usage example below.
- New generate_filepath method to provide a filepath where files can be stored.
  - Works well with the attachment feature of the Email Sender Module. Store files in the filepath generated by this method and pass it into the email attachments.
  - See function code documentation and usage example below.
Email Sender Usage
import switch_api as sw
sw.email.send_email(
api_inputs=api_inputs,
subject='',
body='',
to_recipients=[],
cc_recipients=[], # Optional
bcc_recipients=[], # Optional
attachments=['/file/path/to/attachment.csv'], # Optional
conversation_id='' # Optional
)
generate_filepath Usage
import switch_api as sw
generated_attachment_filepath = sw.generate_filepath(api_inputs=api_inputs, filename='generated_attachment.txt')
# Example of where it could be used
sw.email.send_email(
...
attachments=[generated_attachment_filepath]
...
)
Fixed
- Issue where the upsert_device_sensors_ext method was not posting metadata and tag_columns to the API.
0.3.3
Added
- New upsert_device_sensors_ext method in the integration module.
  - Compared to the existing upsert_device_sensors, the following are supported (see the hedged example below):
    - Installation Code or Installation Id may be provided
      - BUT a mix of the two cannot be provided; all rows must have either code or id, and not both.
    - DriverClassName
    - DriverDeviceType
    - PropertyName
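A hedged example of the extended upsert. The InstallationCode/DriverClassName/DriverDeviceType/PropertyName columns come from the bullets above; the DeviceCode value and any other columns shown are illustrative only, and the api_inputs/df keyword names follow the pattern of the other integration functions:
import pandas as pd
import switch_api as sw

api_inputs = sw.initialize(api_project_id='<portfolio-id>')

# Either InstallationCode or InstallationId may be used - never a mix of the two.
df = pd.DataFrame([{
    'InstallationCode': 'SITE-001',          # or 'InstallationId': '<installation-uuid>'
    'DeviceCode': 'AHU-01',                  # illustrative column
    'DriverClassName': 'ExampleDriver',
    'DriverDeviceType': 'ExampleDeviceType',
    'PropertyName': 'Zone Temperature',
}])

sw.integration.upsert_device_sensors_ext(api_inputs=api_inputs, df=df)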
Added Feature - Switch Python Extensions
- Extensions may be used in Task Insights and Switch Guides for code reuse.
- Extensions may be located in any directory structure within the repo where the usage scripts are located.
  - You may need to adjust your environment to detect the files if you're not running a project environment.
  - Tested on VSCode and PyCharm - contact Switch Support for issues.
Extensions Usage
import switch_api as sw

# Single import line per extension
from extensions.my_extension import MyExtension

@sw.extensions.provide(field="some_extension")
class MyTask:
    some_extension: MyExtension

if __name__ == "__main__":
    task = MyTask()
    task.some_extension.do_something()
Extensions Registration
import uuid
import switch_api as sw

class SimpleExtension(sw.extensions.ExtensionTask):
    @property
    def id(self) -> uuid.UUID:
        # Unique ID for the extension.
        # Generate in CLI using:
        #   python -c 'import uuid; print(uuid.uuid4())'
        return uuid.UUID('46759cfe-68fa-440c-baa9-c859264368db')

    @property
    def description(self) -> str:
        return 'Extension with a simple get_name function.'

    @property
    def author(self) -> str:
        return 'Amruth Akoju'

    @property
    def version(self) -> str:
        return '1.0.1'

    def get_name(self):
        return "Simple Extension"

# Scaffold code for registration. This will not be persisted in the extension.
if __name__ == '__main__':
    task = SimpleExtension()
    api_inputs = sw.initialize(api_project_id='<portfolio-id>')

    # Usage test
    print(task.get_name())

    # =================================================================
    # REGISTER TASK & DATAFEED ========================================
    # =================================================================
    register = sw.pipeline.Automation.register_task(api_inputs, task)
    print(register)
Updated
- get_data now has an optional parameter to return a pandas.DataFrame or JSON
0.2.27
Fix
- Issue where the Timezone DST Offsets API response of upsert_timeseries in the integration module was handled incorrectly.
0.2.26
Updated
- Optional table_def parameter on upsert_data, append_data, and replace_data in the integration module.
  - Enables clients to specify the table structure. It will be merged with the inferred table structure.
- list_deployments in the Automation module now provides Settings and DriverId associated with the deployments.
0.2.25
Updated
- Update handling of empty Timezone DST Offsets of upsert_timeseries in the integration module.
0.2.24
Updated
- Fix default ingestion_mode parameter value to 'Queue' instead of 'Queued' on upsert_timeseries in the integration module.
0.2.23
Updated
- Optional ingestion_mode parameter on upsert_timeseries in the integration module (see the illustration below).
  - Includes ingestionMode in the JSON payload passed to the backend API.
  - The IngestionMode type must be Queue or Stream.
  - The default ingestion_mode parameter value in upsert_timeseries is Queue.
  - To enable table streaming ingestion, please contact helpdesk@switchautomation.com for assistance.
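A short hedged illustration (df construction omitted; the api_inputs/df keyword names follow the usage examples elsewhere in this document, so the calls are left commented out):
import switch_api as sw

# Default behaviour queues the data for ingestion:
# sw.integration.upsert_timeseries(api_inputs=api_inputs, df=df, ingestion_mode='Queue')

# Streaming ingestion - requires table streaming to be enabled by the helpdesk first:
# sw.integration.upsert_timeseries(api_inputs=api_inputs, df=df, ingestion_mode='Stream')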
0.2.22
Updated
- Optional ingestion_mode parameter on upsert_data in the integration module.
  - Includes ingestionMode in the JSON payload passed to the backend API.
  - The IngestionMode type must be Queue or Stream.
  - The default ingestion_mode parameter value in upsert_data is Queue.
  - To enable table streaming ingestion, please contact helpdesk@switchautomation.com for assistance.
Fix
- sw.pipeline.logger handlers stacking
0.2.21
Updated
- Fix on the get_data method in the dataset module.
  - Synced the parameter structure to the backend API for get_data.
  - Parameters are a list of dicts containing name, value, and type items.
  - The type property must be one of the subset defined by the new DATA_SET_QUERY_PARAMETER_TYPES Literal.
0.2.20
Added
- Newly supported Azure Storage Account: GatewayMqttStorage
- An optional property on QueueTask to specify the QueueType
  - Default: DataIngestion
0.2.19
Fixed
- Fix on the upsert_timeseries method in the integration module.
  - Normalized TimestampId and TimestampLocalId seconds.
- Minor fix on the upsert_entities_affected method in the integration utils module.
  - Prevents upserting the entities-affected count when the data feed file status Id is not valid.
- Minor fix on the get_metadata_keys method in the integration helper module.
  - Fix for an issue when a portfolio does not contain any values in the ApiMetadata table.
0.2.18
Added
- Added new is_specific_timezone parameter to the upsert_timeseries method of the integration module (see the hedged illustration after the table below).
  - Accepts a timezone name as the specific timezone used by the source data.
  - Can either be of type str or bool, and defaults to the value of False.
  - Cannot have a value if is_local_time is set to True.
  - Retrieve the list of available timezones using the get_timezones method in the integration module.

| is_specific_timezone | is_local_time | Description |
|---|---|---|
| False | False | Datetimes in the provided data are already in UTC and should remain as the value of Timestamp. The TimestampLocal (conversion to the site-local Timezone) is calculated. |
| False | True | Datetimes in the provided data are already in the site-local Timezone and should be used to set the value of the TimestampLocal field. The UTC Timestamp is calculated. |
| Has Value | True | NOT ALLOWED |
| Has Value | False | Both the Timestamp and TimestampLocal fields are calculated. Datetime is converted to UTC, then to Local. |
| True | | NOT ALLOWED |
| '' (empty string) | | NOT ALLOWED |
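A hedged illustration of the combinations above (df construction omitted; the timezone name is only an example, and the api_inputs/df keyword names follow the rest of this document, so the calls are left commented out):
import switch_api as sw

# Source data timestamps are recorded in a specific, known timezone:
# sw.integration.upsert_timeseries(api_inputs=api_inputs, df=df,
#                                  is_specific_timezone='Australia/Sydney',
#                                  is_local_time=False)

# List the timezone names the platform accepts:
# timezones_df = sw.integration.get_timezones(api_inputs=api_inputs)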
Fixed
- Minor fix on the upsert_tags and upsert_device_metadata methods in the integration module.
  - The list of required_columns was incorrectly being updated when these functions were called.
- Minor fix on the upsert_event_work_order_id method in the integration module when attempting to update the status of an Event.
Updated
- Update on the DiscoveryIntegrationInput namedtuple - added job_id.
- Updated the required columns of the upsert_discovered_records method in the integration module.
  - Added required JobId column for the Data Frame parameter.
0.2.17
Fixed
- Fix on the upsert_timeseries() method in the integration module for duplicate records in ingestion files.
  - Records whose Timestamp falls exactly on the DST start created 2 records with identical values but different TimestampLocal values.
    - One had the TimestampLocal of DST and the other did not.
Updated
- Update on the get_sites() method in the integration module for the InstallationCode column.
  - When the InstallationCode value is null in the database, it returns an empty string.
  - The InstallationCode column is explicitly cast to dtype 'str'.
0.2.16
Added
- Added a new 5 minute interval for the EXPECTED_DELIVERY Literal in the automation module.
  - Supports data feed deployments for Email, FTP, Upload, and Timer.
  - Usage: expected_delivery='5min'
Fixed
- Minor fix on the upsert_timeseries() method using the data_feed_file_status_id parameter in the integration module.
  - The data_feed_file_status_id parameter value is now synced between process records and ingestion files when supplied.
Updated
- Reduced ingestion-file record chunking by half in the upsert_timeseries() method in the integration module.
  - From 100k records per chunk down to 50k records per chunk.
0.2.15
Updated
- Optimized the upsert_timeseries() method's memory upkeep in the integration module.
0.2.14
Fixed
- Minor fix on the invalid_file_format() method creating structured logs in the error_handlers module.
0.2.13
Updated
- Freeze Pandera[io] version to 0.7.1
- PandasDtype has been deprecated since 0.8.0
Compatibility
- Ensure local environment is running Pandera==0.7.1 to match cloud container state
- Downgrade/Upgrade otherwise by running:
- pip uninstall pandera
- pip install switch_api
0.2.12
Added
- Added upsert_tags() method to the integration module.
  - Upsert tags to existing sites, devices, and sensors.
  - Upserting of tags is categorised by the tagging level, which can be Site, Device, or Sensor level.
  - The input dataframe requires an Identifier column whose value depends on the tagging level specified:
    - For Site tag level, InstallationIds are expected in the Identifier column.
    - For Device tag level, DeviceIds are expected in the Identifier column.
    - For Sensor tag level, ObjectPropertyIds are expected in the Identifier column.
- Added upsert_device_metadata() method to the integration module.
  - Upsert metadata to existing devices.
Usage
upsert_tags()
- sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')
- sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Sensor')
- sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Site')
upsert_device_metadata()
- sw.integration.upsert_device_metadata(api_inputs=api_inputs, df=raw_df)
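A hedged sketch of the input dataframe for a Device-level tag upsert. Only the Identifier column is documented above; the ServiceLevel tag column is illustrative, and api_inputs is assumed to come from sw.initialize() as in the other usage examples:
import pandas as pd
import switch_api as sw

# For tag_level='Device', the Identifier column holds DeviceIds.
raw_df = pd.DataFrame([
    {'Identifier': '<device-id-1>', 'ServiceLevel': 'Critical'},
    {'Identifier': '<device-id-2>', 'ServiceLevel': 'Standard'},
])

sw.integration.upsert_tags(api_inputs=api_inputs, df=raw_df, tag_level='Device')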
0.2.11
Added
- New cache module that handles cache data related transactions (see the hypothetical sketch below).
  - set_cache method that stores data to cache.
  - get_cache method that gets stored data from cache.
  - Stored data can be scoped / retrieved in three categories, namely Task, Portfolio, and DataFeed scopes.
    - For Task scope:
      - Data cache can be retrieved by any Portfolio or Datafeed that runs in the same Task.
      - Provide the TaskId (self.id when calling from the driver).
    - For DataFeed scope:
      - Data cache can be retrieved (or set) within the Datafeed deployed in a portfolio.
      - Provide a UUID4 for local testing; api_inputs.data_feed_id will be used when running in the cloud.
    - For Portfolio scope:
      - Data cache can be retrieved (or set) by any Datafeed deployed in the portfolio.
      - scope_id will be ignored and api_inputs.api_project_id will be used.
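A hypothetical sketch of the cache module. The set_cache/get_cache names and the scope categories come from the bullets above, but the parameter names are assumptions, so the calls are left commented out:
import switch_api as sw

# Assumed keyword names - consult the cache module documentation for the real signatures.
# sw.cache.set_cache(api_inputs=api_inputs, scope='Portfolio',
#                    key='last_processed_timestamp', val='2024-01-01T00:00:00Z')
#
# cached_value = sw.cache.get_cache(api_inputs=api_inputs, scope='Portfolio',
#                                   key='last_processed_timestamp')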
0.2.10
Fixed
- Fixed issue with the upsert_timeseries_ds() method in the integration module where required fields such as Timestamp, ObjectPropertyId, and Value were being removed.
0.2.9
Added
- Added upsert_timeseries() method to the integration module.
  - Data is ingested into table storage in addition to the ADX Timeseries table.
  - Carbon calculation is performed where appropriate.
    - Please note: if carbon or cost are included as fields in the Meta column then no carbon / cost calculation will be performed.
Changed
- Added DriverClassName to the required columns for the upsert_discovered_records() method in the integration module.
Fixed
- A minor fix to the 15-minute interval in the upsert_timeseries_ds() method in the integration module.
0.2.8
Changed
- For the EventWorkOrderTask class in the pipeline module, the check_work_order_input_valid() and generate_work_order() methods expect an additional 3 keys to be included by default in the dictionary passed to the work_order_input parameter: InstallationId, EventLink, EventSummary.
Fixed
- Issue with the header/payload passed to the API within the upsert_event_work_order_id() function of the integration module.
0.2.7
Added
- New method, deploy_as_on_demand_data_feed(), added to the Automation class of the pipeline module.
  - This new method is only applicable for tasks that subclass the EventWorkOrderTask base class.
Changed
- The data_feed_id is now a required parameter, not optional, for the following methods on the Automation class of the pipeline module:
  - deploy_on_timer()
  - deploy_as_email_data_feed()
  - deploy_as_ftp_data_feed()
  - deploy_as_upload_data_feed()
- The email_address_domain is now a required parameter, not optional, for the deploy_as_email_data_feed() method on the Automation class of the pipeline module.
Fixed
- Issue with the payload on the switch_api.pipeline.Automation.register_task() method for the AnalyticsTask and EventWorkOrderTask base classes.
0.2.6
Fixed
- Fixed issues on 2 methods in the Automation class of the pipeline module:
  - delete_data_feed()
  - cancel_deployed_data_feed()
Added
In the pipeline module:
- Added new class EventWorkOrderTask
  - This task type is for generation of work orders in 3rd party systems via the Switch Automation Platform's Events UI.
Changed
In the pipeline module:
- AnalyticsTask - added a new method & a new abstract property:
  - analytics_settings_definition abstract property - defines the required inputs (& how these are displayed in the Switch Automation Platform UI) for the task to successfully run.
  - Added check_analytics_settings_valid() method that should be used to validate that the analytics_settings dictionary passed to the start() method contains the required keys for the task to successfully run (as defined by the analytics_settings_definition).
In the error_handlers module:
- In the post_errors() function, the parameter errors_df is renamed to errors and now accepts strings in addition to pandas.DataFrame.
Removed
Due to cutover to a new backend, the following have been removed:
- run_clone_modules() function from the analytics module
- the entire platform_insights module, including the get_current_insights_by_equipment() function
0.2.5
Added
- The Automation class of the pipeline module has 2 new methods added:
  - delete_data_feed()
    - Used to delete an existing data feed and all related deployment settings.
  - cancel_deployed_data_feed()
    - Used to cancel the specified deployment_type for a given data_feed_id.
    - Replaces and expands the functionality previously provided in the cancel_deployed_timer() method, which has been removed.
Removed
- Removed the cancel_deployed_timer() method from the Automation class of the pipeline module.
  - This functionality is available through the new cancel_deployed_data_feed() method when the deployment_type parameter is set to ['Timer'].
0.2.4
Changed
- New parameter data_feed_name added to the 4 deployment methods in the pipeline module's Automation class:
  - deploy_as_email_data_feed()
  - deploy_as_ftp_data_feed()
  - deploy_as_upload_data_feed()
  - deploy_on_timer()
0.2.3
Fixed
- Resolved minor issue on the register_task() method of the Automation class in the pipeline module.
0.2.2
Fixed
- Resolved minor issue on the upsert_discovered_records() function in the integration module related to device-level and sensor-level tags.
0.2.1
Added
- New class added to the pipeline module: DiscoverableIntegrationTask
  - For API integrations that are discoverable.
  - Requires process() & run_discovery() abstract methods to be created when sub-classing.
  - Additional abstract property, integration_device_type_definition, required compared to the base Task.
- New function upsert_discovered_records() added to the integration module.
  - Required for the DiscoverableIntegrationTask.run_discovery() method to upsert discovery records to the Build - Discovery & Selection UI.
Fixed
- Set minimum msal version required for the switch_api package to be installed.
0.2.0
Major overhaul of the switch_api package: the API used by the package was completely replaced.
Changed
- The user_id parameter has been removed from the switch_api.initialise() function.
  - Authentication of the user is now done via Switch Platform SSO. The call to initialise will trigger a web browser window to open to the platform login screen.
    - Note: each call to initialise for a portfolio in a different datacentre will open a browser window and require the user to input their username & password.
    - For initialise on a different portfolio within the same datacentre, the authentication is cached so the user will not be asked to log in again.
- api_inputs is now a required parameter for switch_api.pipeline.Automation.register_task()
- The deploy_on_timer(), deploy_as_email_data_feed(), deploy_as_upload_data_feed(), and deploy_as_ftp_data_feed() methods on the switch_api.pipeline.Automation class have an added parameter: data_feed_id
  - This new parameter allows the user to update an existing deployment for the portfolio specified in the api_inputs.
  - If data_feed_id is not supplied, a new data feed instance will be created (even if the portfolio already has that task deployed to it).
0.1.18
Changed
- Removed the rebuild of the ObjectProperties table in ADX on calls to upsert_device_sensors()
- Removed the rebuild of the Installation table in ADX on calls to upsert_sites()
0.1.17
Fixed
- Fixed issue with the deploy_on_timer() method of the Automation class in the pipeline module.
- Fixed column header issue with the get_tag_groups() function of the integration module.
- Fixed missing Meta column on the table generated via the upsert_workorders() function of the integration module.
Added
- New method for uploading custom data to blob: Blob.custom_upload()
Updated
- Updated upsert_device_sensors() to improve performance and aid release of future functionality.
0.1.16
Added
To the pipeline module:
- New method, data_feed_history_process_errors(), added to the Automation class.
  - This method returns a dataframe containing the distinct set of error types encountered for a specific data_feed_file_status_id.
- New method, data_feed_history_errors_by_type(), added to the Automation class.
  - This method returns a dataframe containing the actual errors identified for the specified error_type and data_feed_file_status_id.
Additional logging was also incorporated in the backend to support the Switch Platform UI.
Fixed
- Fixed issue with the register() method of the Automation class in the pipeline module.
Changed
For the pipeline module:
- Standardised the following methods of the Automation class to return pandas.DataFrame objects.
- Added additional error checks to ensure only allowed values are passed to the various Automation class methods for the parameters: expected_delivery, deploy_type, queue_name, error_type.
For the integration module:
- Added additional error checks to ensure only allowed values are passed to the post_errors function for the parameters: error_type, process_status.
For the dataset module:
- Added an additional error check to ensure only allowed values are provided for the query_language parameter of the get_data function.
For the _platform module:
- Added additional error checks to ensure only allowed values are provided for the account parameter.
0.1.14
Changed
- Updated get_device_sensors() to not auto-detect the data type, to prevent issues such as stripping leading zeroes from metadata values.
0.1.13
Added
To the pipeline module:
- Added a new method, data_feed_history_process_output, to the Automation class.
0.1.11
Changed
- Update to access to the logger - now available as switch_api.pipeline.logger()
- Updates to function documentation
0.1.10
Changed
- Updated the calculation of min/max date (for timezone conversions) inside the upsert_device_sensors function, as the previous calculation method will not be supported in a future release of numpy.
Fixed
- Fixed issue with retrieval of tag groups and tags via the functions:
  - get_sites
  - get_device_sensors
0.1.9
Added
- New module platform_insights
In the integration module:
- New function get_sites added to look up site information (optionally with site-level tags)
- New function get_device_sensors added to assist with lookup of device/sensor information, optionally including either metadata or tags
- New function get_tag_groups added to look up the list of sensor-level tag groups
- New function get_metadata_keys added to look up the list of device-level metadata keys
Changed
- Modifications to connections to storage accounts.
- Additional parameter queue_name added to the following methods of the Automation class of the pipeline module:
  - deploy_on_timer
  - deploy_as_email_data_feed
  - deploy_as_upload_data_feed
  - deploy_as_ftp_data_feed
Fixed
In the pipeline module:
- Addressed issue with the schema validation for the upsert_workorders function
0.1.8
Changed
In the integrations module:
- Updated to batch upserts by DeviceCode to improve reliability & performance of the upsert_device_sensors function.
Fixed
In the analytics module:
- Fixed a typing issue that caused an error in the import of the switch_api package for Python 3.8.
0.1.7
Added
In the integrations module:
- Added new function upsert_workorders
  - Provides the ability to ingest work order data into the Switch Automation Platform.
  - Documentation provides details on required & optional fields in the input dataframe, and also provides information on allowed values for some fields.
  - Two attributes available on the function, added to assist with creation of scripts by providing the lists of required & optional fields:
    - upsert_workorders.df_required_columns
    - upsert_workorders.df_optional_columns
- Added new function get_states_by_country:
  - Retrieves the list of states for a given country. Returns a dataframe containing both the state name and abbreviation.
- Added new function get_equipment_classes:
  - Retrieves the list of allowed values for Equipment Class.
    - EquipmentClass is a required field for the upsert_device_sensors function.
Changed
In the integrations module:
- For the upsert_device_sensors function:
  - New attributes added to assist with creation of tasks:
    - upsert_device_sensors.df_required_columns - returns the list of required columns for the input df
  - Two new fields required to be present in the dataframe passed to the function via the df parameter: EquipmentClass, EquipmentLabel
  - Fix to documentation so required fields in the documentation match.
- For the upsert_sites function:
  - New attributes added to assist with creation of tasks:
    - upsert_sites.df_required_columns - returns the list of required columns for the input df
    - upsert_sites.df_optional_columns - returns the list of optional columns for the input df
- For the get_templates function:
  - Added functionality to filter by type via the new parameter object_property_type
  - Fixed capitalisation issue where the first character of column names in the dataframe returned by the function had been converted to lowercase.
- For the get_units_of_measure function:
  - Added functionality to filter by type via the new parameter object_property_type
  - Fixed capitalisation issue where the first character of column names in the dataframe returned by the function had been converted to lowercase.
In the analytics module:
- Modifications to type hints and documentation for the functions:
  - get_clone_modules_list
  - run_clone_modules
- Additional logging added to run_clone_modules
0.1.6
Added
- Added new function upsert_timeseries_ds() to the integrations module
Changed
- Additional logging added to the invalid_file_format() function in the error_handlers module.
Removed
- Removed the append_timeseries() function
0.1.5
Fixed
- Bug with the upsert_sites() function that caused optional columns to be treated as required columns.
Added
Added additional functions to the error_handlers module:
- validate_datetime() - checks whether the values of the datetime column(s) of the source file are valid. Any datetime errors identified by this function should be passed to the post_errors() function.
- post_errors() - used to post errors (apart from those identified by the invalid_file_format() function) to the data feed dashboard.
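A hedged sketch of the intended flow. The function names, and the errors/api_inputs/error_type/process_status parameters, appear elsewhere in this changelog; the remaining keyword names and values are assumptions, so the calls are left commented out:
import switch_api as sw

# Validate the datetime column(s) of the source dataframe, then post any problems
# found to the data feed dashboard.
# datetime_errors = sw.error_handlers.validate_datetime(df=df, datetime_col=['Timestamp'])
# if len(datetime_errors) > 0:
#     sw.error_handlers.post_errors(api_inputs=api_inputs, errors=datetime_errors,
#                                   error_type='DateTime', process_status='...')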
0.1.4
Changed
- Added additional required properties to the Abstract Base Classes (ABC): Task, IntegrationTask, AnalyticsTask, LogicModuleTask. These properties are:
  - Author
  - Version
- Added additional parameter query_language to the switch.integration.get_data() function. Allowed values for this parameter are:
  - sql
  - kql
- Removed the name_as_filename and treat_as_timeseries parameters from the following functions:
  - switch.integration.replace_data()
  - switch.integration.append_data()
  - switch.integration.upload_data()