
A Python package to read and write files in CDM format. Customized for SkyPoint use cases.

Project description

skypoint-python-cdm-connector

Python Spark CDM Connector by SkyPoint.

An Apache Spark data source for the Microsoft Azure Common Data Model. Reading and writing are supported, and the connector is a work in progress: the library is at an early stage and makes progress each weekly sprint. Please file issues for any bugs that you find.

For more information, see the Azure Common Data Model documentation.

We support Azure Data Lake Storage (ADLS) as the storage layer, historical data preservation using snapshots of the schema and data files, and usage within PySpark, Azure Functions, etc.
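Historical preservation via snapshots can be pictured as timestamp-suffixed copies of the schema and data files kept alongside the current version. The helper below is a hypothetical illustration of that idea only; it is not this connector's actual naming scheme.

```python
from datetime import datetime


def snapshot_name(filename: str, when: datetime) -> str:
    """Return a hypothetical timestamp-suffixed snapshot name for a file."""
    stem, dot, ext = filename.rpartition(".")
    stamp = when.strftime("%Y-%m-%dT%H-%M-%S")
    if not dot:  # file has no extension
        return f"{filename}@{stamp}"
    return f"{stem}@{stamp}.{ext}"


print(snapshot_name("customEntity.csv", datetime(2020, 5, 1, 12, 0, 0)))
# customEntity@2020-05-01T12-00-00.csv
```

Keeping each snapshot immutable and writing new data under a fresh timestamp is what makes point-in-time reads of older schema and data versions possible.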

Upcoming: support for incremental data refresh handling, CDM 1.1, AWS (S3), and Google Cloud (Cloud Storage).

Example

  1. See the sample usage file skypoint_python_cdm.py.
  2. Dynamically add/remove entities, annotations, and attributes.
  3. Pass a Reader or Writer object for any storage account you would like to read data from or write data to.
  4. The code below shows basic usage.
# Required imports; Model and ADLSWriter are provided by this package,
# and the exact import path may vary by version.
from datetime import datetime
import pandas as pd

# Initialize empty model
m = Model()

# Sample dataframe
df = {"country": ["Brazil", "Russia", "India", "China", "South Africa", "ParaSF"],
      "currentTime": [datetime.now(), datetime.now(), datetime.now(), datetime.now(), datetime.now(), datetime.now()],
      "area": [8.516, 17.10, 3.286, 9.597, 1.221, 2.222],
      "capital": ["Brasilia", "Moscow", "New Delhi", "Beijing", "Pretoria", "ParaSF"],
      "population": [200.4, 143.5, 1252, 1357, 52.98, 12.34]}
df = pd.DataFrame(df)

# Generate entity from the dataframe
entity = Model.generate_entity(df, "customEntity")

# Add generated entity to model
m.add_entity(entity)

# Add model level annotation
# Annotation can be added at entity level as well as attribute level
Model.add_annotation("modelJsonAnnotation", "modelJsonAnnotationValue", m)


# Create an ADLSWriter to write into ADLS
writer = ADLSWriter("ACCOUNT_NAME", "ACCOUNT_KEY",
                     "CONTAINER_NAME", "STORAGE_NAME", "DATAFLOW_NAME")    

# Write data as well as model.json in ADLS storage
m.write_to_storage("customEntity", df, writer)
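Alongside the data files, write_to_storage emits a model.json describing the entities. The snippet below is a hand-built sketch of what a model.json document for the entity above might look like, using field names from the public model.json v1 schema; this package's actual output may include additional metadata such as annotations and partitions.

```python
import json

# A hand-built sketch of a CDM model.json document for "customEntity".
# Field names follow the public model.json v1 schema; the attribute
# dataTypes mirror the sample dataframe's columns.
model_doc = {
    "name": "SampleModel",
    "version": "1.0",
    "entities": [
        {
            "$type": "LocalEntity",
            "name": "customEntity",
            "attributes": [
                {"name": "country", "dataType": "string"},
                {"name": "currentTime", "dataType": "dateTime"},
                {"name": "area", "dataType": "double"},
                {"name": "capital", "dataType": "string"},
                {"name": "population", "dataType": "double"},
            ],
        }
    ],
}

print(json.dumps(model_doc, indent=2))
```

This is the document a reader consumes to discover entity names, attribute types, and data file locations before loading the data itself.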

Contributing

This project welcomes contributions and suggestions.

References

Model.json version 1 schema

A clean implementation of Python objects converted from/to the model.json file


Download files

Download the file for your platform.

Source Distribution

cdm-connector-0.0.6.22.tar.gz (15.1 kB view details)


Built Distribution


cdm_connector-0.0.6.22-py3-none-any.whl (22.9 kB view details)


File details

Details for the file cdm-connector-0.0.6.22.tar.gz.

File metadata

  • Download URL: cdm-connector-0.0.6.22.tar.gz
  • Upload date:
  • Size: 15.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.5

File hashes

Hashes for cdm-connector-0.0.6.22.tar.gz
  • SHA256: 79cfd6daca86aad6b33b511a7506b75331faa8f6709106333b8a4f758fc83a45
  • MD5: b97952fad54e960fca9b859ab91fcd0c
  • BLAKE2b-256: 93c4c852854e418f1f8487263e52eb4e360f68b2f5f901a67e1bb4e60854a390


File details

Details for the file cdm_connector-0.0.6.22-py3-none-any.whl.

File metadata

  • Download URL: cdm_connector-0.0.6.22-py3-none-any.whl
  • Upload date:
  • Size: 22.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.5

File hashes

Hashes for cdm_connector-0.0.6.22-py3-none-any.whl
  • SHA256: 21c74e2b7e72d8bac02e74d33edd084db25929f85536bc5a289df8984ffdc434
  • MD5: 586ab0b81ca7bf1cd1a231063cad2190
  • BLAKE2b-256: faf01c91643f1838b3e5812b2948e5b33706a2d75e2ed38d84442ac0a556f512

