DFMock

A pandas dataframe mocker for Python 3.

Why?

Sometimes you just need mock data that is already in pandas datatypes. There are lots of data generation tools out there, but most produce CSVs or similar files that then have to be read back in by pandas, which adds a volatile point to your tests: what if your logic is correct, but the datatypes didn't import correctly from the CSV? DFMock aims to generate mock dataframes where the datatypes are controlled.

Installation

Use pip

pip install dfmock

You can also get the source from the project repository.

The API

Using DFMock is simple: import it into your module, choose your column names and data types, set the number of rows, and generate your frame.

from dfmock import DFMock

columns = { "hamburger":"string",
            "hot_dog":"integer",
            "shoelace":"timedelta"
          }
dfmock = DFMock(count=100, cols=columns)
dfmock.generate_dataframe()

my_mocked_dataframe = dfmock.dataframe
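For instance, here is a minimal pytest sketch that exercises your own logic against a mocked frame. It assumes the API shown above; total_hot_dogs is just a stand-in for whatever function you actually want to test:

import pandas as pd
from dfmock import DFMock

def total_hot_dogs(frame: pd.DataFrame) -> int:
    ## stand-in for your own logic under test
    return int(frame["hot_dog"].sum())

def test_total_hot_dogs_returns_int():
    columns = {"hamburger": "string", "hot_dog": "integer"}
    mocker = DFMock(count=50, cols=columns)
    mocker.generate_dataframe()
    ## the frame arrives already typed, so no CSV round-trip can skew the dtypes
    assert isinstance(total_hot_dogs(mocker.dataframe), int)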

Pandas data types supported:

PANDAS TYPE    DICT VALUE
object         string
int64          int
float64        float
bool           bool
datetime64     datetime
timedelta      timedelta
category       category

So to make a dataframe with a column "banana" that is an int64 and a column "rockstar" that is a category type:

dfmock.cols = {"banana":"int","rockstar":"category"}

Pretty simple.
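A quick sketch (assuming the constructor shown earlier) to confirm that these dict values map to the pandas dtypes in the table above:

from dfmock import DFMock

dfmock = DFMock(count=10, cols={"banana": "int", "rockstar": "category"})
dfmock.generate_dataframe()
print(dfmock.dataframe.dtypes)
## expected: banana -> int64, rockstar -> category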

Grow to Size

Sometimes you need your dataset to be a certain memory size rather than a certain row count. The grow_dataframe_to_size() method grows a frame until it reaches or passes the given size (in MB). Need a 10MB dataframe?

dfmock.generate_dataframe()
dfmock.size
## returns 0.2 MB

dfmock.grow_dataframe_to_size(10)
dfmock.size
## returns ~10 MB
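If you want to sanity-check the result with plain pandas, DataFrame.memory_usage gives a comparable figure. Note this is an independent calculation and may not match dfmock.size exactly, depending on how DFMock measures size:

size_mb = dfmock.dataframe.memory_usage(deep=True).sum() / (1024 * 1024)
print(f"{size_mb:.1f} MB")  ## roughly 10 MB after growing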

Grouping

NOTE: timedelta and category datatypes are not currently supported for grouping.

Sometimes you need a column you can aggregate on with a given data type. For example, you may want 1M rows with one of 4 datetime values (perhaps representing 4 observation reporting timestamps). You can do this using grouping. A grouped column is declared by passing a dict as the data type value, with the params for the grouped column as keys, like this:

columns = {"amazing_grouped_column_with_3_values": { "option_count":3, "option_type":"string"}}

This will give you a column with only 3 distinct values and a (nearly) equal distribution. If you need to control the distribution, you can pass the histogram argument. This is useful if, for instance, you want a dataset with 4 datetime values where one value covers 50% of records, another 30%, and the remaining two values split the last 20%. You would declare that distribution like so:

columns = {"super_cool_grouped_column_with_histogram": {"option_count":4, "option_type":"datetime","histogram":(5,3,2,2,)}}

The integers you pass to histogram must add up to 10, and there must be as many of them as option_count (i.e. if option_count = 5, you need 5 histogram values). This currently caps option_count at 10 when a histogram is used. Your desired row count must also be divisible by 10; this keeps the math simple.
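Putting the grouping options together: a short sketch assuming the API shown above (the value_counts call is plain pandas and just confirms the distribution):

from dfmock import DFMock

columns = {
    "report_timestamp": {"option_count": 4, "option_type": "datetime", "histogram": (5, 3, 1, 1,)},
    "reading": "float",
}
dfmock = DFMock(count=1000, cols=columns)
dfmock.generate_dataframe()

## expect roughly 500 / 300 / 100 / 100 rows per distinct timestamp
print(dfmock.dataframe["report_timestamp"].value_counts())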

Contribution

We welcome pull requests!
