
sqldf for pandas

Project description


pysqldf allows you to query pandas DataFrames using SQL syntax. It works similarly to sqldf in R. pysqldf seeks to provide a more familiar way of manipulating and cleaning data for people new to Python or pandas.

Installation

$ pip install pysqldf

Basics

The main class in pysqldf is SQLDF. The SQLDF constructor takes a set of environment variables and several optional parameters:

- a set of session/environment variables (a dictionary of variables, typically locals() or globals())
- the temporary file type (whether the SQLite database is kept in memory or in a temporary file)
- user defined functions
- user defined aggregate functions

pysqldf uses SQLite syntax. Any pandas DataFrame (or data convertible to one) in the supplied environment is automatically detected by pysqldf, and you can query it as you would any regular SQL table.

$ python
>>> from pysqldf import SQLDF, load_meat, load_births
>>> sqldf = SQLDF(globals())
>>> meat = load_meat()
>>> births = load_births()
>>> print(sqldf.execute("SELECT * FROM meat LIMIT 10;").head())
                  date  beef  veal  pork  lamb_and_mutton broilers other_chicken turkey
0  1944-01-01 00:00:00   751    85  1280               89     None          None   None
1  1944-02-01 00:00:00   713    77  1169               72     None          None   None
2  1944-03-01 00:00:00   741    90  1128               75     None          None   None
3  1944-04-01 00:00:00   650    89   978               66     None          None   None
4  1944-05-01 00:00:00   681   106  1029               78     None          None   None

>>> q = "SELECT m.date, m.beef, b.births FROM meat m INNER JOIN births b ON m.date = b.date;"
>>> print(sqldf.execute(q).head())
                    date    beef  births
403  2012-07-01 00:00:00  2200.8  368450
404  2012-08-01 00:00:00  2367.5  359554
405  2012-09-01 00:00:00  2016.0  361922
406  2012-10-01 00:00:00  2343.7  347625
407  2012-11-01 00:00:00  2206.6  320195

>>> q = "SELECT strftime('%Y', date) AS year, SUM(beef) AS beef_total FROM meat GROUP BY year;"
>>> print(sqldf.execute(q).head())
   year  beef_total
0  1944        8801
1  1945        9936
2  1946        9010
3  1947       10096
4  1948        8766
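
DataFrames you create yourself are picked up the same way as the bundled datasets. A minimal sketch (the scores DataFrame and its columns are made-up illustration data, not part of the package):

$ python
>>> import pandas as pd
>>> from pysqldf import SQLDF
>>> scores = pd.DataFrame({"name": ["a", "b", "c"], "score": [10, 20, 30]})
>>> sqldf = SQLDF(locals())
>>> top = sqldf.execute("SELECT name, score FROM scores WHERE score >= 20;")
>>> len(top)  # two rows ("b" and "c") satisfy the filter
2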

User defined functions and user defined aggregate functions are also supported.

$ python
>>> from pysqldf import SQLDF, load_iris
>>> import math
>>> import numpy
>>> ceil = lambda x: math.ceil(x)
>>> udfs = { "ceil": ceil }
>>> udafs = { "variance": lambda values: numpy.var(values) }
>>> # or you can define the aggregate function as a class
>>> # class variance(object):
... #     def __init__(self):
... #         self.a = []
... #     def step(self, x):
... #         self.a.append(x)
... #     def finalize(self):
... #         return numpy.var(self.a)
...
>>> # udafs={ "variance": variance }
>>> iris = load_iris()
>>> sqldf = SQLDF(globals(), udfs=udfs, udafs=udafs)
>>> sqldf.execute("""
    SELECT
        ceil(sepal_length) AS sepal_length,
        ceil(sepal_width) AS sepal_width,
        ceil(petal_length) AS petal_length,
        ceil(petal_width) AS petal_width,
        species
    FROM iris;
    """).head()
   sepal_length  sepal_width  petal_length  petal_width      species
0             6            4             2            1  Iris-setosa
1             5            3             2            1  Iris-setosa
2             5            4             2            1  Iris-setosa
3             5            4             2            1  Iris-setosa
4             5            4             2            1  Iris-setosa
>>> sqldf.execute("SELECT species, variance(sepal_width) AS var FROM iris GROUP BY species;")
           species       var
0      Iris-setosa  0.142276
1  Iris-versicolor  0.096500
2   Iris-virginica  0.101924
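
The class form shown in the comments above can be registered the same way as the lambda. A minimal sketch, reusing the variance class from the comments (only the queried column differs):

$ python
>>> import numpy
>>> from pysqldf import SQLDF, load_iris
>>> class variance(object):
...     def __init__(self):
...         self.a = []
...     def step(self, x):
...         self.a.append(x)
...     def finalize(self):
...         return numpy.var(self.a)
...
>>> iris = load_iris()
>>> sqldf = SQLDF(locals(), udafs={"variance": variance})
>>> result = sqldf.execute("SELECT species, variance(petal_length) AS var FROM iris GROUP BY species;")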

Documentation

SQLDF(env, inmemory=True, udfs={}, udafs={})

env: a dictionary mapping SQL variable names to objects in your program; each key is the name visible in SQL and each value is the corresponding program variable. Passing locals() or globals() is the simplest way to expose your variables.

inmemory: SQLite database option. If True (the default), the data is loaded into an in-memory SQLite database; if False, a temporary file on disk is used.

udfs: dictionary of user defined functions. Each key is the function name and each value is the function itself. See the sqlite3 documentation for create_function.

udafs: dictionary of user defined aggregate functions. Each key is the function name and each value is either a function or a class. If the value is a function, it receives a single argument, the list of column values, and should return one aggregated value. If the value is a class, it must implement the step/finalize interface described in the sqlite3 documentation for create_aggregate.
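
Putting the parameters together, a minimal sketch (the prices DataFrame and the ceil/stddev functions below are illustrative names, not part of the library):

$ python
>>> import math
>>> import numpy
>>> import pandas as pd
>>> from pysqldf import SQLDF
>>> prices = pd.DataFrame({"item": ["a", "a", "b"], "price": [1.2, 3.4, 5.6]})
>>> udfs = {"ceil": lambda x: math.ceil(x)}               # scalar function
>>> udafs = {"stddev": lambda values: numpy.std(values)}  # aggregate: list of values -> one value
>>> sqldf = SQLDF(locals(), inmemory=True, udfs=udfs, udafs=udafs)
>>> result = sqldf.execute("SELECT item, stddev(price) AS sd FROM prices GROUP BY item;")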

load_iris(), load_meat(), load_births()

Load the example datasets as pandas DataFrames.
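
For example, each loader returns a DataFrame that can be queried directly:

$ python
>>> from pysqldf import SQLDF, load_births
>>> births = load_births()
>>> recent = SQLDF(locals()).execute("SELECT * FROM births ORDER BY date DESC LIMIT 12;")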



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pysqldf-1.2.2.tar.gz (30.7 kB)

Uploaded Source

Built Distributions

pysqldf-1.2.2-py3-none-any.whl (27.4 kB)

Uploaded Python 3

pysqldf-1.2.2-py2.py3-none-any.whl (27.4 kB)

Uploaded Python 2, Python 3

File details

Details for the file pysqldf-1.2.2.tar.gz.

File metadata

  • Download URL: pysqldf-1.2.2.tar.gz
  • Upload date:
  • Size: 30.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for pysqldf-1.2.2.tar.gz
Algorithm Hash digest
SHA256 c20cfca30f54549e8f5e1201f4d5ddcefd518bd6a734d4f331de0a1701a2cd5f
MD5 31e6b62e10bbf57e1a1d51cd31ea09d7
BLAKE2b-256 57552cbacbf55832b99f1207681531ed7a88df663d6df5d2be61ed2aea9c44ee


File details

Details for the file pysqldf-1.2.2-py3-none-any.whl.

File metadata

File hashes

Hashes for pysqldf-1.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 147b0bdbe4fb9385185e9e77acdce18976cd399702443d6595f3b57d75894b35
MD5 a6a7b8d7091ca4c24ed800bc25e607f6
BLAKE2b-256 6abdb1462a7bd4a8f2bd208f19dc35479b8526f1f693db30f64f4af4ea53f081


File details

Details for the file pysqldf-1.2.2-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for pysqldf-1.2.2-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 3c303303eeed8b83149853a8a86324eb231ac2382439a601ba904b72c45184db
MD5 a7d1310eda02e350564788e26ed4d497
BLAKE2b-256 1a763fe9842339d9c26904761c668662930152f79c046b7882ef056eb4acfe37

