
Extraction of fingerprint and palm data from grayscale images.


Fingerprint and Handprint Extractor

This project uses simple image processing operations to extract finger and hand impressions from white paper sheets. It's intended to be used as an auxiliary tool for forensics research at UNAM.

Getting Started

The latest stable version can be downloaded from PyPI. No development branches are available yet; the code in the repository corresponds to the latest release. No GUI is currently available, but one is planned for the future. Higher-level analysis of the extracted fingerprints is also in the works.

Prerequisites

The code is written in Python 3 and relies on NumPy, OpenCV and SciPy.

The dependencies are automatically managed by pip.

Installing

To install, simply create a virtualenv and install the project with pip:

pip install fingerprints-unam-colab

Afterwards, the extraction script can be executed with the following command:

extract_fp path_to_input_image path_to_output_folder

where:

  • path_to_input_image is an absolute or relative path to any image file readable by the system.
  • path_to_output_folder is an absolute or relative path to a folder (empty or not).

Inside the output folder, the extract_fp script will create a new sub-folder, named after the input image, where it will place the extracted information.
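
For example, an invocation like the following (all paths and names here are purely illustrative):

extract_fp scans/sheet_01.png output/

would leave the cropped handprint and fingerprint images under a sub-folder such as output/sheet_01/.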

How it works

The extraction algorithm is quite simple; the whole process is laid out in the main function of the extract_fp.py file, and a rough sketch of the pipeline is given in the code example after the list below.

  1. The script reads the input image in grayscale.
  2. A smoothing filter is applied (median blur with a 13x13 window).
  3. The resulting image is binarized with local thresholding, to account for the expected high contrast between the black hand/finger prints and the white background.
  4. The binary image is inverted (the regions of interest were set to zero in step 3).
  5. A dilation operation is applied (10 iterations with an 8x8 structuring element) to connect contiguous regions.
  6. An erosion operation is performed (5 iterations with an 8x8 structuring element) to avoid over-segmentation.
  7. The regions of interest are segmented into independent connected components (8-connected neighborhood).
  8. Components smaller than 10000 pixels are discarded as too small.
  9. The remaining components are classified as follows:
    • The two largest components are classified as (independent) handprints.
    • Any remaining component approximating a quadrangle is classified as a fingerprint.
    • Any remaining component with its centroid near one of the handprints is classified as part of said handprint.
      • Proximity is determined by measuring the distance between the centroid of each selected handprint and the farthest point on its perimeter. Any contiguous component with its centroid inside the induced circle is considered "near" (thus, part of the handprint).
  10. The classified components are cropped, rotated (based on their minimum bounding boxes) and saved as independent images.
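
The preprocessing and segmentation steps (1-8) map onto standard OpenCV calls. The snippet below is a minimal sketch of those steps under that assumption, not the project's actual code: the choice of adaptive thresholding variant, its block size and constant, and the function name segment_prints are guesses for illustration.

import sys

import cv2
import numpy as np


def segment_prints(image_path):
    """Sketch of steps 1-8; the thresholding parameters are assumptions."""
    # 1. Read the input image in grayscale.
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # 2. Smooth with a 13x13 median blur.
    blurred = cv2.medianBlur(gray, 13)

    # 3. Local (adaptive) thresholding; the dark prints end up as zeros.
    #    The block size (35) and constant (10) are illustrative guesses.
    binary = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY, 35, 10)

    # 4. Invert so the regions of interest become non-zero foreground.
    inverted = cv2.bitwise_not(binary)

    # 5./6. Dilate 10 times, then erode 5 times, with an 8x8 structuring element.
    kernel = np.ones((8, 8), np.uint8)
    cleaned = cv2.erode(cv2.dilate(inverted, kernel, iterations=10),
                        kernel, iterations=5)

    # 7./8. Label 8-connected components and drop those under 10000 pixels.
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(
        cleaned, connectivity=8)
    keep = [i for i in range(1, n_labels)  # label 0 is the background
            if stats[i, cv2.CC_STAT_AREA] >= 10000]
    return labels, stats, centroids, keep


if __name__ == "__main__":
    labels, stats, centroids, keep = segment_prints(sys.argv[1])
    print("%d candidate print regions found" % len(keep))

Steps 9 and 10 then operate on the components that survive this filtering.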

Note: Most of the logic for managing connected components as independent regions is encoded in a Region class (in regions.py).
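
The regions.py module is not reproduced here, but a minimal Region-style wrapper covering steps 9 and 10 could look roughly like the following. Only the class name comes from the note above; every method, parameter and default below is a hypothetical illustration rather than the project's actual API.

import cv2
import numpy as np


class Region:
    """Hypothetical sketch of a connected-component wrapper (not the real regions.py)."""

    def __init__(self, labels, label, area, centroid):
        # Binary mask containing only this component.
        self.mask = (labels == label).astype(np.uint8) * 255
        self.area = area
        self.centroid = np.asarray(centroid, dtype=float)
        # Outer contour, used for the shape and proximity tests below.
        contours, _ = cv2.findContours(self.mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        self.contour = max(contours, key=cv2.contourArea)

    def radius(self):
        # Distance from the centroid to the farthest perimeter point (step 9).
        pts = self.contour.reshape(-1, 2).astype(float)
        return float(np.max(np.linalg.norm(pts - self.centroid, axis=1)))

    def is_near(self, handprint):
        # A component is "near" a handprint if its centroid lies inside
        # the circle induced by the handprint's radius.
        return np.linalg.norm(self.centroid - handprint.centroid) <= handprint.radius()

    def looks_like_quadrangle(self, tolerance=0.02):
        # Rough quadrangle test via polygon approximation (tolerance is a guess).
        epsilon = tolerance * cv2.arcLength(self.contour, True)
        return len(cv2.approxPolyDP(self.contour, epsilon, True)) == 4

    def crop_rotated(self, image):
        # Crop the component from `image`, deskewed by its minimum bounding box (step 10).
        (cx, cy), (w, h), angle = cv2.minAreaRect(self.contour)
        rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
        rotated = cv2.warpAffine(image, rot, image.shape[1::-1])
        x, y = int(cx - w / 2), int(cy - h / 2)
        return rotated[y:y + int(h), x:x + int(w)]

With a wrapper like this, step 9 reduces to sorting the regions by area, taking the two largest as handprints, labelling the quadrangle-like remainder as fingerprints, and attaching everything else to whichever handprint it is near.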

Authors

  • Arturo Curiel - Initial work - website

See also the list of contributors who participated in this project.

License

This project is licensed under the GNU GPLv3 License; see the LICENSE.md file for details.

Acknowledgments

  • Idk, my cat I guess.
