Fast db insert with PostgreSQL binary copy
pgcopy is a small system for very fast bulk insertion of data into a PostgreSQL database table using binary copy.
$ pip install pgcopy
pgcopy provides a facility for copying data from an iterable of tuple-like objects using a CopyManager, which must be instantiated with a psycopg2 database connection, the table name, and an iterable containing the names of the columns to be inserted, in the order in which they will be provided. pgcopy inspects the database to determine the datatypes of the columns.
from datetime import datetime

import psycopg2
from pgcopy import CopyManager

cols = ('id', 'timestamp', 'location', 'temperature')
now = datetime.now()
records = [
    (0, now, 'Jerusalem', 72.2),
    (1, now, 'New York', 75.6),
    (2, now, 'Moscow', 54.3),
]
conn = psycopg2.connect(database='weather_db')
mgr = CopyManager(conn, 'measurements_table', cols)
mgr.copy(records)

# don't forget to commit!
conn.commit()
By default, a temporary file on disk is used. If there’s enough memory, you can get a slight performance benefit with in-memory storage:
from io import BytesIO
mgr.copy(records, BytesIO)
A db schema can be specified in the table name using dot notation:
mgr = CopyManager(conn, 'myschema.measurements', cols)
Currently the following PostgreSQL datatypes are supported:

- double precision
- timestamp with time zone
- numeric (data must be decimal.Decimal)
- varchar
- uuid
- json and jsonb
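As an illustration, a record matching several of the supported types might be built like this (a sketch; the column layout is hypothetical):

```python
from datetime import datetime, timezone
from decimal import Decimal

# One tuple-like record matching three of the supported column types:
# double precision -> float, timestamp with time zone -> aware datetime,
# numeric -> decimal.Decimal (per the requirement above).
record = (
    72.25,
    datetime.now(timezone.utc),
    Decimal('101.325'),
)
```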
Unicode strings in the data to be inserted (all values of type str in Python 3) should be encoded as bytes before being passed to copy. Values intended to be NULL in the database should be passed as None rather than as empty strings.
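For example, under Python 3 a record with a text column and a NULL value might be prepared like this (the column layout is illustrative):

```python
# str values are encoded to bytes before copy(); NULLs are passed as None.
records = [
    (0, 'Jerusalem'.encode('utf-8'), 72.2),
    (1, 'New York'.encode('utf-8'), None),  # temperature will be NULL
]
```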
PostgreSQL numeric does not support Decimal('Inf') or Decimal('-Inf'). pgcopy serializes these as NaN.
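If you would rather make that substitution explicit in your own code than rely on the serialization behavior, a hypothetical pre-check could look like this:

```python
from decimal import Decimal

def sanitize(value):
    # Hypothetical helper: replace infinities with NaN up front, matching
    # what pgcopy would otherwise do during serialization.
    if value.is_infinite():
        return Decimal('NaN')
    return value
```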
For a fast test run using the current environment, use nose:

$ nosetests
For more thorough testing, the Tox configuration will run tests on Python versions 2.7 and 3.3 - 3.6:

$ tox
Additionally, tests can be run with no local requirements other than the ubiquitous Docker:
$ docker-compose up pgcopy
Below are simple benchmarks for inserting 100,000 records. This gives a general idea of the kind of speedup available with pgcopy:
$ nosetests -c tests/benchmark.cfg
ExecuteManyBenchmark: 7.75s
PGCopyBenchmark: 0.54s
----------------------------------------------------------------------
Ran 2 tests in 9.101s
Replacing a Table
When possible, faster insertion may be achieved by inserting into an empty table with no indices or constraints. In cases where the entire contents of the table can be reinserted, the Replace context manager automates the process. On entry, it creates a new table like the original, with a temporary name; default column values are included. It provides the temporary name for populating the table within the context. On exit, it recreates the constraints, indices, triggers, and views on the new table, then replaces the old table with the new one. It can be used like so:
from pgcopy import CopyManager, Replace

with Replace(conn, 'mytable') as temp_name:
    mgr = CopyManager(conn, temp_name, cols)
    mgr.copy(records)
Where possible, Replace gives the new database objects the same names as the old ones; names of foreign key and check constraints, however, will be mangled. As of v0.6 there is also pgcopy.util.RenameReplace, which renames the original objects using a transformation function instead of dropping them.
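A sketch of how that might look, assuming RenameReplace mirrors Replace with the transformation function as a third argument (check the pgcopy docs for the exact signature):

```python
# Hypothetical transformation for pgcopy.util.RenameReplace: instead of
# being dropped, each original object is renamed via this function.
def retire(name):
    return name + '_old'

# Usage sketch (assumes a live psycopg2 connection `conn`):
#
# from pgcopy import CopyManager
# from pgcopy.util import RenameReplace
# with RenameReplace(conn, 'measurements_table', retire) as temp_name:
#     CopyManager(conn, temp_name, cols).copy(records)
```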
Note that on PostgreSQL 9.1 and earlier, concurrent queries on the table will fail once the table is dropped.
See also cpgcopy, a Cython implementation that is about twice as fast.
14 Feb, 2018

- Mention commit in the README

22 Aug, 2017

- Support unlimited varchar fields (thanks John A. Bachman)
- Update documentation regarding string encoding in Python 3 (thanks John A. Bachman)
- Fix bug in varchar truncation
- Fix bug in numeric type formatter (reported by Peter Van Eynde)

25 Mar, 2017

- Support db schema (thanks Marcin Gozdalik)

26 Jan, 2017

- Support uuid, json, and jsonb types (thanks Igor Mastak)
- Integrate Travis CI
- Add docker test strategy

19 Jan, 2017

- Run tests with tox
- Support Python 3
- Initial release on PyPI

19 Jan, 2017

- Add support for serializing Python decimal.Decimal to PostgreSQL numeric

21 Oct, 2014

- RenameReplace variant

14 Jul, 2014

- Support default values and sequences

14 Jul, 2014

- Fix Replace utility class bugs
- Add view support to Replace

8 Jul, 2014

- Move Cython optimization to separate project
- Add Replace utility class

7 Jul, 2014

- Cython optimization

29 Jun, 2014

- Initial version
Download the file for your platform:

- pgcopy-1.3.1-py2.py3-none-any.whl (13.0 kB): Wheel, py2.py3
- pgcopy-1.3.1.tar.gz (10.9 kB): Source