
Smartsheet Python SDK and Pydantic Wrapper


smartsheet-pydantic

What is it?

A Smartsheet Python SDK wrapper incorporating Pydantic models for type validation, to guarantee successful data reads and updates into Smartsheet.

The Smartsheet API requires incoming data to comply with the target sheet's column types; when the types do not align, the write call can fail. To streamline data validation, this package piggybacks on Pydantic's BaseModel structure and its validation capabilities through a "SmartModel".

This package also provides a framework for creating a "DataSource" class, with which you can declaratively define your data source and attach it to a SmartModel. This gives you a standard way of querying your data source and creating or updating rows in Smartsheet.


Getting Started

Install & Import The Module

Requires Python greater than 3.10. Standard installation using pip, then import the package.

pip install smartsheet-pydantic
# The package name uses underscore "_" instead of hyphen
import smartsheet_pydantic

Everything revolves around the Controller

The smartsheet_pydantic package revolves around the Controller. Once a Controller is set up, it can extract data from a user-designated data source (RESTful API or PostgreSQL) and refresh the designated Smartsheet using an additive approach (data is updated or added, but never deleted). The Controller can also facilitate writing into the Smartsheet.

Whenever the Smartsheet is written to, read from, or updated, smartsheet_pydantic (as the name suggests) uses Pydantic's BaseModel to validate the data. This ensures the data has the right types and will not fail to write to the given Smartsheet.
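
To illustrate the idea with plain Pydantic (which SmartModel extends), a model rejects mismatched types before they would ever reach Smartsheet. The WeatherRow model below is a hypothetical stand-in for illustration only, not part of this package:

from datetime import date
from pydantic import BaseModel, ValidationError


class WeatherRow(BaseModel):  # hypothetical model, for illustration only
    date: date
    temperature: float


try:
    WeatherRow(date="not-a-date", temperature="warm")
except ValidationError as exc:
    print(exc)  # the bad record is rejected before any Smartsheet API call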

Defining the Controller

In order to generate the Controller, we will use the SheetDetail class to collect the necessary details.

  1. sheet_id: Every Smartsheet grid has a unique ID; this is used to target the Smartsheet.
  2. description: A user-defined description of the target Smartsheet to give some context. Can be anything.
  3. smart_model: A smartsheet-pydantic model we will create to aid in data validation (an extension of Pydantic's BaseModel).
  4. source: Defaults to None. This field is used to override a SmartModel's default DataSource for unit-testing purposes. Omit for normal use.
from smartsheet_pydantic.controller import SheetDetail
from smartsheet_pydantic.smartmodels import SmartModel
from smartsheet_pydantic.sources import DataSource


sheet_detail = SheetDetail(
    sheet_id=123456789,  # placeholder sheet ID
    description="Description of the target Smartsheet",
    smart_model=WeatherModel,  # we will create this in the subsequent steps
)

Defining a DataSource

There are two options when creating a data source. One is a RESTful API endpoint, where you extend the DataSourceAPI class and provide a url attribute. The other is a PostgreSQL database source using DataSourcePostgres, where you provide the access details, the query to use, and the column name designations. Both are defined below; the RESTful API source is the one attached to the SmartModel in this example.

from smartsheet_pydantic.sources import DataSourceAPI, DataSourcePostgres


# DataSource class to call a RESTful API endpoint.
class WeatherRestAPI(DataSourceAPI):
    url = 'http://127.0.0.1:8000/weather_data'


# DataSource class for PostgreSQL database
class ExamplePostgres(DataSourcePostgres):
    user: str          = "username"
    password: str      = "password"
    host: str          = "host name"
    database: str      = "database name"
    query: str         = "SELECT * FROM weather_table WHERE location = 'USA'"
    columns: list[str] = [ "index", "date", "temperature", "humidity", "rain" ]

Defining a SmartModel

A SmartModel class is an extension of Pydantic's BaseModel, so defining a SmartModel is very similar: define the data fields and their types as class attributes.

You will also need to define three additional Configuration parameters.

  1. source: the DataSource class you defined in the previous step, which is associated with this SmartModel.
  2. unique_columns: a list of column names the model uses to define uniqueness. When data in Smartsheet is updated, these columns are used to find and update rows that already exist.
  3. key_mapping: if you have defined data field names that differ from the source data, this dictionary can be used to map and rename those columns. If the field names are the same, the value must be set to None.
from smartsheet_pydantic.smartmodels import SmartModel
from datetime import date


class WeatherModel(SmartModel):

    index: int
    date: date
    temperature: float
    humidity: float
    rain: bool

    class Configuration:
        source = WeatherRestAPI
        unique_columns: list[str] = ['index']
        key_mapping: dict | None = None
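
Since WeatherModel is ultimately a Pydantic model, it should coerce and validate raw records at construction time. A minimal sketch, assuming SmartModel keeps BaseModel's standard constructor and coercion behavior (the values are illustrative only):

# Raw source values (e.g. strings from an API payload) are coerced to the
# declared types; invalid values raise pydantic.ValidationError instead of
# producing a bad write to Smartsheet.
row = WeatherModel(
    index="1",
    date="2023-01-01",
    temperature="65.2",
    humidity="14.5",
    rain=False,
)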

Generating the controller

Now that all of the components are ready, we take the SheetDetail instance and provide it to the SmartsheetControllerFactory. Calling its .get_controller() method generates the SmartsheetController object.

from smartsheet_pydantic.controller import SmartsheetController, SmartsheetControllerFactory

controller_factory = SmartsheetControllerFactory()
controller: SmartsheetController = controller_factory.get_controller(sheet_detail)
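
The factory keeps sheet-specific details out of controller construction, so the same factory instance can presumably be reused for other sheets (another_sheet_detail below is hypothetical):

# Hypothetical second sheet, built from another SheetDetail instance.
other_controller: SmartsheetController = controller_factory.get_controller(another_sheet_detail)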

Using the controller

Use the provided data source to refresh the target Smartsheet

controller.refresh_smartsheet_from_source()

Extracting data from Smartsheet into a list of SmartModel instances.

data: list[WeatherModel] = controller.extract_as_smartmodel()

Manually write new data or update existing data in Smartsheet

data: list[WeatherModel]
controller.update_rows(data)

Delete all row data from Smartsheet

controller.delete_all_rows()
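
Putting these calls together, a typical cycle might look like the following sketch. It only uses the controller methods shown above; the temperature tweak is purely illustrative:

# Pull the latest data from the configured DataSource into Smartsheet.
controller.refresh_smartsheet_from_source()

# Read the sheet back as validated SmartModel instances.
rows: list[WeatherModel] = controller.extract_as_smartmodel()

# Apply an illustrative change, then push the updates back.
for row in rows:
    row.temperature = round(row.temperature, 1)
controller.update_rows(rows)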

SmartModel

Not all data manipulation can be handled within the SmartModel. You may want to work on the data using Pandas or NumPy. Therefore, SmartModel has methods to help transform data into a SmartModel, as well as extract it back out into a dictionary.

Extracting Data From SmartModel

weather_data: list[WeatherModel]
extracted_data: list[dict] = [smart_model.dict() for smart_model in weather_data]

print(extracted_data)

Results

[
    {
        'index': 1,
        'date': date(2023, 1, 1),
        'temperature': 65.2,
        'humidity': 14.5,
        'rain': False,
    },
    {
        'index': 2,
        'date': date(2023, 1, 2),
        'temperature': 67.2,
        'humidity': 14.2,
        'rain': False,
    },
    {
        'index': 3,
        'date': date(2023, 1, 3),
        'temperature': 62.3,
        'humidity': 12.2,
        'rain': False,
    },
]

Converting Data Into SmartModel

extracted_data: list[dict]

weather_data: list[WeatherModel] = \
    [WeatherModel.from_source(data) for data in extracted_data]
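
For instance, a round trip through pandas (not a dependency of this package, shown purely as a sketch) could look like this:

import pandas as pd

# SmartModel instances -> plain dicts -> DataFrame
df = pd.DataFrame([smart_model.dict() for smart_model in weather_data])

# ...any pandas/NumPy manipulation here, e.g. rounding a column...
df["temperature"] = df["temperature"].round(1)

# DataFrame -> dicts -> SmartModel instances, ready for controller.update_rows()
updated: list[WeatherModel] = [
    WeatherModel.from_source(record) for record in df.to_dict(orient="records")
]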
