An attempt to write SQL-like programs and translate them into Python, SQL, Hive or PIG.
.. _l-README:
README
======
.. only:: html

    .. image:: https://travis-ci.org/sdpython/pysqllike.svg?branch=master
        :target: https://travis-ci.org/sdpython/pysqllike
        :alt: Build status

    .. image:: https://ci.appveyor.com/api/projects/status/rrpks1pgivea23js?svg=true
        :target: https://ci.appveyor.com/project/sdpython/pysqllike
        :alt: Build Status Windows

    .. image:: https://circleci.com/gh/sdpython/pysqllike/tree/master.svg?style=svg
        :target: https://circleci.com/gh/sdpython/pysqllike/tree/master

    .. image:: https://badge.fury.io/py/pysqllike.svg
        :target: http://badge.fury.io/py/pysqllike

    .. image:: http://img.shields.io/github/issues/sdpython/pysqllike.png
        :alt: GitHub Issues
        :target: https://github.com/sdpython/pysqllike/issues

    .. image:: https://img.shields.io/badge/license-MIT-blue.svg
        :alt: MIT License
        :target: http://opensource.org/licenses/MIT

    .. image:: https://landscape.io/github/sdpython/pysqllike/master/landscape.svg?style=flat
        :target: https://landscape.io/github/sdpython/pysqllike/master
        :alt: Code Health

    .. image:: https://requires.io/github/sdpython/pysqllike/requirements.svg?branch=master
        :target: https://requires.io/github/sdpython/pysqllike/requirements/?branch=master
        :alt: Requirements Status

    .. image:: https://codecov.io/github/sdpython/pysqllike/coverage.svg?branch=master
        :target: https://codecov.io/github/sdpython/pysqllike?branch=master
**Links:**
* `GitHub/pysqllike <https://github.com/sdpython/pysqllike>`_
* `documentation <http://www.xavierdupre.fr/app/pysqllike/helpsphinx/index.html>`_
* `Blog <http://www.xavierdupre.fr/app/pysqllike/helpsphinx/blog/main_0000.html#ap-main-0>`_
Description
-----------
Writing a map/reduce job
(using `PIG <https://pig.apache.org/>`_ for example)
usually requires switching from local files to remote files
(on `Hadoop <http://hadoop.apache.org/>`_).
One way to work is to extract a small sample of the data which will be processed
by the map/reduce job. The job is then developed locally and, when it works,
it is run in a parallelized environment.

The goal of this extension is to allow the implementation of
such a job using Python syntax as follows:
::

    def myjob(input):
        # age2 is computed from the age column (see the PIG job below)
        iter = input.select(input.age, input.nom, age2=input.age * input.age)
        wher = iter.where((iter.age > 60).Or(iter.age < 25))
        return wher

    # IterRow (provided by pysqllike) wraps sample rows following the (nom, age) schema
    input = IterRow(None, [{"nom": "jean", "age": 10}, {"nom": "paul", "age": 40}])
    output = myjob(input)
When the job is ready, it can be translated into a `PIG <https://pig.apache.org/>`_
job::

    input = LOAD '...' USING PigStorage('\t') AS (nom, age);
    iter = FOREACH input GENERATE age, nom, age*age AS age2 ;
    wher = FILTER iter BY age > 60 or age < 25 ;
    STORE wher INTO '...' USING PigStorage();
It should also be translated into
`SQL <http://fr.wikipedia.org/wiki/Structured_Query_Language>`_.
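As an illustration only (the SQL translation is not implemented yet), the same job
would roughly correspond to the following query, assuming the data sits in a table
named ``input``::

    SELECT age, nom, age * age AS age2
    FROM input
    WHERE age > 60 OR age < 25;
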
Functionalities
---------------
* not yet ready
Design
------
This project contains various helpers for logging, unit tests and documentation generation.

* a source folder: ``src``
* a unit test folder: ``_unittests``, go to this folder and run ``run_unittests.py``
* a documentation folder: ``_doc``, which contains the documentation
* a file ``setup.py`` to build and install the module
* a file ``make_help.py`` to build the sphinx documentation