a python package to split machine learning data sets using graph partitioning

Project description

mlNODS

split machine learning data sets using graph partitioning

Appropriate assessments require appropriate splits of training and evaluation data sets, and this in turn requires clustering. For many problems, single-linkage clustering suffices toward this end. Having encountered a problem that could not be solved by such a standard procedure, we developed a simple graph-based tool for creating unique data sets.

mlNODS is a graph-based method that splits original data sets into non-overlapping sets which cannot be grouped without removing some of the data. mlNODS optimizes for two constraints: (1) retain as many data points as possible, and (2) remove any overlap between two split sets. The nodes of the graph are the original data points, and the edges are measures of similarity between nodes (e.g. sequence similarity for protein sets). The method begins by building the full graph and proceeds by removing nodes in order to optimally fit the constraints to the similarity table. mlNODS is applicable to any problem and has the additional benefit of allowing overlap within one set (i.e. training on homologues) while it is disallowed between two sets (i.e. training and testing do not overlap).
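
To make the idea concrete, here is a minimal Python sketch of the component-based grouping step, assuming an edge list of (id1, id2, score) tuples. It keeps every data point and never lets an above-cutoff link cross two splits, but, unlike mlnods itself, it does not remove nodes to break up a single dominant component:

    from collections import defaultdict

    def split_by_components(ids, edges, cutoff, n_splits):
        """Group IDs so that no two splits share an above-cutoff link.

        ids      : iterable of instance IDs
        edges    : iterable of (id1, id2, score) tuples
        cutoff   : links with score >= cutoff are treated as 'too similar'
        n_splits : number of output sets
        """
        # Build an adjacency list keeping only above-cutoff links.
        graph = defaultdict(set)
        for a, b, score in edges:
            if score >= cutoff and a != b:
                graph[a].add(b)
                graph[b].add(a)

        # Find connected components; members of one component stay together.
        seen, components = set(), []
        for node in ids:
            if node in seen:
                continue
            stack, comp = [node], set()
            while stack:
                cur = stack.pop()
                if cur in comp:
                    continue
                comp.add(cur)
                stack.extend(graph[cur] - comp)
            seen |= comp
            components.append(comp)

        # Assign whole components to splits, largest first, to balance sizes.
        splits = [set() for _ in range(n_splits)]
        for comp in sorted(components, key=len, reverse=True):
            min(splits, key=len).update(comp)
        return splits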

usage: mlnods [-h] -s SPLITS -c CUTOFF [-l LIMIT] -e EDGES_FILE
                [-f EDGES_FORMAT] -n NODES_FILE [-a] [-r RANDOM]
                [-o OUTFOLDER] [-v] [-q] [--version]

This is a script that will create independent sets of data

Version: 1.0 [03/14/20]

optional arguments:
  -h, --help            show this help message and exit
  -s SPLITS, --splits SPLITS
                        number of splits required
  -c CUTOFF, --cutoff CUTOFF
                        similarity cutoff in the units of link scores
  -l LIMIT, --limit LIMIT
                        limit on the number of links for each node (default=0, infinity)
  -e EDGES_FILE, --edges EDGES_FILE
                        file containing a table of instances with link scores for each pair
  -f EDGES_FORMAT, --format EDGES_FORMAT
                        format of the table file

                        blast     : takes a list of -m 9 formatted blast files and builds a table based on seqID
                        hssp      : takes a list of -m 9 formatted blast files, runs HSSP scoring script and builds an HSSP distance table
                        self<int> : space/tab separated table file, similarity score in column <int>
                                    eg "ID1 ID2 similarity_score" will be addressed as self3 (default=self5)
  -n NODES_FILE, --nodes NODES_FILE
                        instance file containing IDs of all instances being considered

                        IDs are case-independent (eg ABC = abc)
                        IDs are always preceded by ">" and followed by a white space.
                        No white spaces are allowed in an ID.
                        If a score is provided for an ID, it should be surrounded by spaces and directly follow the ID
                        (eg. >abl1_human 10 gene associated with ....)
                        Everything between two IDs is printed in the junction files, but not considered in evaluation
  -a, --abundance       the option to score

                        false : score retrieved from instance file, range [0-100], default=50 when missing
                        true  : score approximated by actual number of times an ID appears in the instance file
  -r RANDOM, --random RANDOM
                        set a fixed random seed to generate consistent partitions
  -o OUTFOLDER, --outfolder OUTFOLDER
                        path to output folder (default=<current directory>)
  -v, --verbose         set verbosity level
  -q, --quiet           no logging to stdout
  --version             show program's version number and exit

If an ID is present in the instance file but not in the table file, the ID is considered to not be linked to anything else.
If an ID is present in the table file but not in the instance file, it is ignored.
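
For illustration, a run with three splits might combine a nodes file, a self3-formatted edges file, and an invocation like the following (file names, IDs, and scores below are hypothetical):

    nodes file (instances.txt):
        >abl1_human 10 gene associated with ...
        >src_human 20 proto-oncogene tyrosine-protein kinase
        >kit_human mast/stem cell growth factor receptor

    edges file (edges.tsv, self3 format: ID1 ID2 similarity_score):
        abl1_human src_human 87
        abl1_human kit_human 35

    invocation:
        mlnods -s 3 -c 40 -e edges.tsv -f self3 -n instances.txt -o results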

mlnods was developed by Yana Bromberg and refactored by Maximilian Miller.

Feel free to contact us for support at services@bromberglab.org.
This software is licensed under [NPOSL-3.0](http://opensource.org/licenses/NPOSL-3.0)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
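
Alternatively, the package can typically be installed straight from PyPI with pip, which picks the appropriate distribution automatically:

    pip install mlnods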

Source Distribution

mlnods-1.3.tar.gz (25.6 kB)

Uploaded Source

Built Distribution

mlnods-1.3-py3-none-any.whl (26.5 kB)

Uploaded Python 3

File details

Details for the file mlnods-1.3.tar.gz.

File metadata

  • Download URL: mlnods-1.3.tar.gz
  • Upload date:
  • Size: 25.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.0.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.8.1

File hashes

Hashes for mlnods-1.3.tar.gz

  • SHA256: 3b8731bc1ad9988247c943ff67b4bf7fef52fb1daa128719713fecbaf10912e4
  • MD5: 3e751834afd1181a9b4a6d826b956f3b
  • BLAKE2b-256: 4729518cb64b5223d9b0cec66252580d3a855ada41191d101e2a2544f890c2bb

See more details on using hashes here.
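
As a quick check, the SHA256 digest above can be verified against a downloaded copy with a few lines of Python (assuming the archive sits in the current directory):

    import hashlib

    expected = "3b8731bc1ad9988247c943ff67b4bf7fef52fb1daa128719713fecbaf10912e4"

    # Hash the downloaded archive in chunks and compare to the published digest.
    digest = hashlib.sha256()
    with open("mlnods-1.3.tar.gz", "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)

    assert digest.hexdigest() == expected, "checksum mismatch"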

File details

Details for the file mlnods-1.3-py3-none-any.whl.

File metadata

  • Download URL: mlnods-1.3-py3-none-any.whl
  • Upload date:
  • Size: 26.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.0.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.8.1

File hashes

Hashes for mlnods-1.3-py3-none-any.whl

  • SHA256: d105b8b30530e2f7b26cd4352e7f709b36bee5ae2665519f459889f6b53658e0
  • MD5: 15370650dea42381a1aee84583402d64
  • BLAKE2b-256: 2b1d3cc45844834e0d944ff5a43f08999b383fa5c013fde87d08ae67f6392ad6

See more details on using hashes here.
