
A command line tool for working with JSON documents on local disc

Project description

py_dataset

py_dataset is a Python wrapper for libdataset, the C shared library behind the dataset tool, for working with JSON objects as collections. Collections can be stored on disc or in cloud storage. JSON objects are stored in collections using a pairtree, as plain UTF-8 text files, which means the objects can be accessed with common Unix text-processing tools as well as from most programming languages.
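
The exact on-disc layout is an implementation detail of dataset itself, but the pairtree idea is simple: a key is split into short segments that become nested directories, and the object is just JSON text at the bottom. A stdlib-only sketch of that idea (the two-character split and the `<key>.json` filename are illustrative assumptions, not dataset's actual scheme):

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

def pairtree_path(root: Path, key: str) -> Path:
    # Split the key into two-character segments: "one" -> "on/e"
    segments = [key[i:i + 2] for i in range(0, len(key), 2)]
    return root.joinpath(*segments) / f"{key}.json"

with TemporaryDirectory() as tmp:
    path = pairtree_path(Path(tmp), "one")
    path.parent.mkdir(parents=True, exist_ok=True)
    # The object is plain UTF-8 JSON text on disc ...
    path.write_text(json.dumps({"one": 1}), encoding="utf-8")
    # ... so any text tool (cat, grep, jq) or language can read it back.
    print(json.loads(path.read_text(encoding="utf-8")))
```

Because objects are ordinary files, nothing stops you from inspecting a collection with `cat` or `grep` alongside the Python API.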

This package wraps all dataset operations, such as initializing collections and creating, reading, updating, and deleting JSON objects in a collection. Its enhanced features include generating data frames and importing and exporting JSON objects to and from CSV files.
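
The CSV import/export in py_dataset itself operates on a collection name; purely to illustrate the mapping between JSON objects and CSV rows, here is a stdlib-only sketch assuming flat objects (the object contents and field names are hypothetical):

```python
import csv
import io

# Hypothetical flat JSON objects as they might live in a collection.
objects = {"one": {"name": "Ada", "year": 1843},
           "two": {"name": "Alan", "year": 1936}}

# Export: one CSV row per object, with the object's key as the first column.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["key", "name", "year"])
writer.writeheader()
for key, obj in objects.items():
    writer.writerow({"key": key, **obj})

# Import: each row becomes a key plus a JSON-ready dict again.
buf.seek(0)
restored = {row.pop("key"): row for row in csv.DictReader(buf)}
print(restored["one"]["name"])  # note: CSV values come back as strings
```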

py_dataset is released under a BSD-style license.

Features

dataset supports

  • basic storage actions (create, read, update, and delete)
  • listing of collection keys (including filtering and sorting)
  • import and export of CSV files
  • reshaping data by performing simple object joins
  • creating data frames from collections based on key lists and dot paths into the stored JSON objects
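
Conceptually, a data frame pairs a list of keys with a list of dot paths into the stored objects. The real frame API lives in libdataset; as a stdlib-only illustration of what a dot path selects (the objects and paths below are made up for the example):

```python
# Minimal dot-path lookup: ".author.family" walks nested dicts.
def dotpath(obj, path):
    value = obj
    for part in path.lstrip(".").split("."):
        value = value[part]
    return value

objects = {
    "one": {"title": "On Computable Numbers", "author": {"family": "Turing"}},
    "two": {"title": "Sketch of the Analytical Engine", "author": {"family": "Lovelace"}},
}

# A frame is, conceptually, the selected keys crossed with the dot paths.
keys = ["one", "two"]
paths = [".title", ".author.family"]
frame = [[dotpath(objects[k], p) for p in paths] for k in keys]
print(frame)
```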

See docs for details.

Limitations of dataset

dataset has many limitations, including:

  • it is not a multi-process, multi-user data store (it's files on "disc" without locking)
  • it is not a replacement for a repository management system
  • it is not a general purpose database system
  • it does not supply version control on collections or objects

Install

Available via pip (pip install py_dataset) or by downloading this repo and running python setup.py install. This repo includes the dataset shared C libraries compiled for Windows, macOS, and Linux, and the appropriate library will be used automatically.

Quick Tutorial

This module provides the functionality of the dataset command line tool as a Python 3.8 module. Once installed, try out the following commands to see if everything is in order (or to get familiar with dataset).

The "#" comments don't have to be typed in; they are there to explain the commands as you type them. Start the tour by launching Python 3 in interactive mode.

    python3

Then run the following Python commands.

    from py_dataset import dataset
    # Almost all the commands require the collection name as the first parameter;
    # we're storing that name in c_name for convenience.
    c_name = "a_tour_of_dataset.ds"

    # Let's create a dataset collection. We use the method called
    # 'init'; it returns True on success or False otherwise.
    dataset.init(c_name)

    # Let's check that our collection exists: True if it exists,
    # False if it doesn't.
    dataset.status(c_name)

    # Let's count the records in our collection (should be zero)
    cnt = dataset.count(c_name)
    print(cnt)

    # Let's read all the keys in the collection (should be an empty list)
    keys = dataset.keys(c_name)
    print(keys)

    # Now let's add a record to our collection. To create a record we need
    # the collection name (e.g. c_name), a key (must be a string), and a
    # record (i.e. a dict literal or variable).
    key = "one"
    record = {"one": 1}
    # If create returns False, we can check the last error message 
    # with the 'error_message' method
    if not dataset.create(c_name, key, record):
        print(dataset.error_message())

    # Let's count and list the keys in our collection; we should see
    # a count of 1 and a key of 'one'.
    dataset.count(c_name)
    keys = dataset.keys(c_name)
    print(keys)

    # We can read the record we stored using the 'read' method.
    new_record, err = dataset.read(c_name, key)
    if err != '':
        print(err)
    else:
        print(new_record)

    # Let's modify new_record and update the record in our collection
    new_record["two"] = 2
    if not dataset.update(c_name, key, new_record):
        print(dataset.error_message())

    # Let's print out the record we stored, using the read method.
    # read returns a tuple, so we're printing the first element.
    print(dataset.read(c_name, key)[0])

    # Finally we can remove (delete) a record from our collection
    if not dataset.delete(c_name, key):
        print(dataset.error_message())

    # We should now have a count of zero records
    cnt = dataset.count(c_name)
    print(cnt)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

py_dataset-1.0.1.tar.gz (8.0 MB)


Built Distribution

py_dataset-1.0.1-py3-none-any.whl (8.0 MB)


File details

Details for the file py_dataset-1.0.1.tar.gz.

File metadata

  • Download URL: py_dataset-1.0.1.tar.gz
  • Upload date:
  • Size: 8.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.9.6

File hashes

Hashes for py_dataset-1.0.1.tar.gz
Algorithm Hash digest
SHA256 7409c4a854a0896262ec1716029563ac87e0305c903038640598aeddb3590bd7
MD5 bf9116465c16f59ce0681f9dab560a63
BLAKE2b-256 6a4338d89ac3a036e4aabf265e8f2cdc7f832b7d4d03e957317c16fe3818140f

See more details on using hashes here.

File details

Details for the file py_dataset-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: py_dataset-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 8.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.9.6

File hashes

Hashes for py_dataset-1.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 b6f91e64d387a1eb79c46268593f5e936a9fd5cc36e4c17778945d1e4baaf841
MD5 61defa107fb54c77ce6f061628b6f704
BLAKE2b-256 ca3a779b3ffd261ff21932b770e698ee3fefe1f563b6583d03fe1880abcb73de

