A collection of the Apache Spark stub files
Development of this package will continue for Spark 2.4 and 3.0 until their official deprecation.
- If your problem is specific to Spark 2.4 or 3.0, feel free to create an issue or open a pull request here.
- Otherwise, please check the official Spark JIRA and contributing guidelines. If you create a JIRA ticket or Spark PR related to type hints, please ping me with [~zero323] or @zero323 respectively. Thanks in advance.
Static error detection (see SPARK-20631)
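As a generic illustration (the helper below is hypothetical and merely stands in for an annotated PySpark function, it is not part of the actual API), type annotations let a checker such as mypy reject ill-typed calls before the code ever runs:

```python
# Generic sketch (hypothetical helper, not part of the PySpark API) of the
# kind of mistake that stub files let a static checker catch.
def select_column(name: str) -> str:
    """Pretend DataFrame helper that expects a column name."""
    return name.upper()

# With the annotation above, mypy reports an incompatible-argument error for
# `select_column(42)` statically; without type information, the bug only
# surfaces at runtime:
try:
    select_column(42)  # type: ignore[arg-type]
except AttributeError as exc:
    print("caught at runtime instead of statically:", exc)
```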
Installation and usage
Please note that the guidelines for distribution of type information are still a work in progress (PEP 561 - Distributing and Packaging Type Information). Currently, the installation script overlays existing Spark installations (the pyi stub files are copied next to their py counterparts in the PySpark installation directory). If this approach is not acceptable, you can add the stub files to the search path manually.
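For example, instead of overlaying the installation, you could keep the stubs in a separate directory and point mypy at it via its search path (a sketch; the paths below are placeholders, not actual layout of this package):

```shell
# Alternative to overlaying: keep the stubs in their own directory and put
# that directory on mypy's stub search path (path is a placeholder).
export MYPYPATH=/path/to/pyspark-stubs
mypy your_script.py
```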
According to PEP 484:
> Third-party stub packages can use any location for stub storage. Type checkers should search for them using PYTHONPATH. A default fallback directory that is always checked is shared/typehints/python3.5/ (or 3.6, etc.)
Please check the usage notes below before proceeding.
The package is available on PyPI:

```shell
pip install pyspark-stubs
```

or on conda-forge:

```shell
conda install -c conda-forge pyspark-stubs
```
| Editor | Type checking | Autocompletion | Notes |
|--------|---------------|----------------|-------|
| Atom | ✔ | ✔ | Through plugins. |
| IPython / Jupyter Notebook | ✘ | ✔ | |
| VIM / Neovim | ✔ | ✔ | Through plugins. |
| Visual Studio Code | ✔ | ✔ | Completion with plugin. |
| Environment independent / other editors | ✔ | ✔ | Through Mypy and Jedi. |
This package is tested against the MyPy development branch and, in rare cases (primarily after important upstream bugfixes), may not be compatible with the preceding MyPy release.
PySpark Version Compatibility
Package versions follow PySpark versions, with the exception of maintenance releases - i.e. pyspark-stubs==2.3.0 should be compatible with pyspark>=2.3.0,<2.4.0. Maintenance releases (post1, post2, …, postN) are reserved for internal annotation updates.
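For instance (version numbers here are purely illustrative), pinning the stubs to the same minor line as PySpark could look like:

```shell
# Illustrative pins: the stubs share PySpark's major.minor version line
pip install "pyspark>=2.3.0,<2.4.0" "pyspark-stubs>=2.3.0,<2.4.0"
```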
As of release 2.4.0, most of the public API is covered. For details, please check the API coverage document.
Editor support notes:

- Not supported or tested.
- Requires atom-mypy or equivalent.
- Requires autocomplete-python-jedi or equivalent.
- It is possible to use magics to type check directly in the notebook. In general, though, you'll have to export the whole notebook to a .py file and run the type checker on the result.
- Requires PyDev 7.0.3 or later.
- Using vim-mypy, syntastic or Neomake.
- With the Mypy linter.
- With the Python extension for Visual Studio Code.
- Just use your favorite checker directly, optionally combined with a tool like entr.
- See the Jedi editor plugins list.