
A formatter for Python code and SparkSQL queries.

Project description

pyspark-sql-formatter

A formatter for PySpark code with SQL queries. It relies on the Python formatter yapf and the SparkSQL formatter sparksqlformatter, which work independently of each other; configurations can be specified for each formatter separately.
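Conceptually, the tool makes two independent passes: SQL strings found in matching variables are formatted first, then the whole script is handed to yapf. The toy sketch below is not the package's actual code; the keyword list, regex, and function name are illustrative stand-ins that mimic only the SQL pass:

```python
import re

def toy_format(script, query_names=("query",)):
    """Toy stand-in for the SQL pass: uppercase a few SQL keywords inside
    single-quoted string variables whose names contain one of query_names.
    In the real tool, sparksqlformatter formats the SQL and yapf then
    formats the surrounding Python."""
    keywords = ("select", "from", "where")

    def format_sql(sql):
        for kw in keywords:
            sql = re.sub(r"\b%s\b" % kw, kw.upper(), sql)
        return sql

    lines = []
    for line in script.splitlines():
        m = re.match(r"(\w+)\s*=\s*'(.*)'\s*$", line)
        if m and any(q in m.group(1) for q in query_names):
            lines.append("%s = '%s'" % (m.group(1), format_sql(m.group(2))))
        else:
            lines.append(line)
    return "\n".join(lines)
```

For example, `toy_format("my_query = 'select * from t0'\nspark.sql(my_query)")` uppercases the keywords in `my_query` but leaves the `spark.sql(...)` call untouched.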

Installation

Install using pip

pip install pysqlformatter

Install from source

  1. Download source code.
  2. Navigate to the source code directory.
  3. Run python setup.py install or pip install . (note the trailing dot).

Compatibility

Supports Python 2.7 and 3.6+.

Usage

pysqlformatter can be used as either a command-line tool or a Python library.

Use as command-line tool

usage: pysqlformatter [-h] [-f FILES [FILES ...]] [-i] [--query-names QUERY_NAMES [QUERY_NAMES ...]] [--python-style PYTHON_STYLE] [--sparksql-style SPARKSQL_CONFIG]

Formatter for PySpark code and SparkSQL queries.

optional arguments:
  -h, --help            show this help message and exit
  -f FILES [FILES ...], --files FILES [FILES ...]
                        Paths to files to format.
  -i, --in-place        Format the files in place.
  --python-style PYTHON_STYLE
                        Style for Python formatting, interface to https://github.com/google/yapf.
  --sparksql-style SPARKSQL_CONFIG
                        Style for SparkSQL formatting, interface to https://github.com/largecats/sparksql-formatter.
  --query-names QUERY_NAMES [QUERY_NAMES ...]
                        String variables whose names contain these strings will be formatted as SQL queries. Defaults to 'query'.
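The --query-names option uses a substring match on variable names. The sketch below is a hypothetical illustration of that matching rule (not pysqlformatter's internals), using the standard-library ast module to find top-level string assignments whose target names contain one of the given substrings:

```python
import ast

def find_query_variables(source, query_names=("query",)):
    """Return names of string variables whose names contain one of the
    given substrings -- the matching rule described for --query-names.
    Illustrative only; not taken from pysqlformatter's source."""
    found = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if (isinstance(target, ast.Name)
                        and any(q in target.id for q in query_names)
                        and isinstance(node.value, ast.Constant)
                        and isinstance(node.value.value, str)):
                    found.append(target.id)
    return found
```

Under this rule, a variable named main_query matches the default 'query', while a string assigned to a name like other is left alone.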

E.g.,

$ pysqlformatter -f <path_to_file> --python-style='pep8' --sparksql-style="{'reservedKeywordUppercase': False}" --query-names query

Or using config files:

$ pysqlformatter -f <path_to_file> --python-style="<path_to_python_style_config_file>" --sparksql-style="<path_to_sparksql_config_file>" --query-names query
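For the --python-style file, yapf accepts an INI-style file with a [style] section (as documented in the yapf README); the snippet below is an assumed example for the yapf side only. The sparksqlformatter config file syntax is described in that project's own README.

```
[style]
based_on_style = pep8
column_limit = 100
```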

Use as Python library

Call pysqlformatter.api.format_script() to format a script passed as a string:

>>> from pysqlformatter import api
>>> script = '''query = 'select * from t0'\nspark.sql(query)'''
>>> api.format_script(script=script, pythonStyle='pep8', sparksqlConfig=sparksqlConfig(), queryNames=['query'])
"query = '''\nSELECT\n    *\nFROM\n    t0\n'''\nspark.sql(query)\n"

Call pysqlformatter.api.format_file() to format a script in a file:

>>> from pysqlformatter import api
>>> api.format_file(filePath=<path_to_file>, pythonStyle='pep8', sparksqlConfig=sparksqlConfig(), queryNames=['query'], inPlace=False)
...

Project details


Download files

Download the file for your platform.

Source Distribution

pysqlformatter-0.0.0.tar.gz (10.3 kB)

Uploaded: Source

Built Distribution


pysqlformatter-0.0.0-py2.py3-none-any.whl (13.3 kB)

Uploaded: Python 2, Python 3

File details

Details for the file pysqlformatter-0.0.0.tar.gz.

File metadata

  • Download URL: pysqlformatter-0.0.0.tar.gz
  • Upload date:
  • Size: 10.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.8.3

File hashes

Hashes for pysqlformatter-0.0.0.tar.gz
  • SHA256: 99a6855d27199f0092726593a2e2a1df12a15aba0884c853abfe1a431f862ae9
  • MD5: e0969c1228d5ffaba16c4ac882f5baa3
  • BLAKE2b-256: 9fa7447571ab8d6f87b5c4cadef44fa1fb124c7e5ff41af2a33aaabd66565850


File details

Details for the file pysqlformatter-0.0.0-py2.py3-none-any.whl.

File metadata

  • Download URL: pysqlformatter-0.0.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 13.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.8.3

File hashes

Hashes for pysqlformatter-0.0.0-py2.py3-none-any.whl
  • SHA256: 62d8621afa5359e4e4049b53154a2d706d092f60c51e2e15686c81c91fdfec4d
  • MD5: 3687c62746e9362f53ad80b4fa5eda31
  • BLAKE2b-256: 701df48059b396b6bc1de27caf7c28d89ab334c9ee8080e06b36397c53e97351

