Software package to reproduce experiments of Chapter 18, "An Introduction to Vein Presentation Attacks and Detection" of the "Handbook of Biometric Anti-Spoofing: Presentation Attack Detection 2nd Edition" book

Project description

An Introduction to Vein Presentation Attacks and Detection

This package is part of the signal-processing and machine learning toolbox Bob. It reproduces results presented in Chapter 18 of the “Handbook of Biometric Anti-Spoofing: Presentation Attack Detection 2nd Edition”.


We use conda to manage the software stack installation. To install this package and all dependencies, first install conda, and then run the following command on your shell:

$ conda create --name hobpad2-veins --override-channels -c https://www.idiap.ch/software/bob/conda -c defaults python=3 bob.hobpad2.veins
$ conda activate hobpad2-veins
(hobpad2-veins) $ #type all commands inside this "activated" environment

This will install all of the required software to reproduce the experiments in this book chapter.

You now need to procure the raw data to run the experiments. Visit the web page for the VERA FingerVein database and click on the “Download” link on that page. Follow the instructions to sign the End-User License Agreement (EULA); you will then be able to access the data package. Download and uncompress it, and take note of the directory leading to the root of the dataset. In this guide, we’ll refer to it using the generic path /path/to/verafinger.

Once the data is downloaded and uncompressed, configure Bob’s access to it by editing (or creating) a file called ${HOME}/.bob_bio_databases.txt and set the path to the dataset there. Here is an example:

(hobpad2-veins) $ cat ~/.bob_bio_databases.txt
[YOUR_VERAFINGER_DIRECTORY] = /path/to/verafinger
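If you prefer to script this step, the entry can be appended from the shell. This is a sketch only, assuming a writable home directory and that the key name shown in the example above is the one the database interface expects:

```shell
# Append the dataset entry to Bob's database configuration file,
# creating the file first if it does not exist.
rcfile="${HOME}/.bob_bio_databases.txt"
touch "${rcfile}"
# Only append the entry if it is not already present, so re-runs are harmless.
grep -q 'YOUR_VERAFINGER_DIRECTORY' "${rcfile}" || \
  echo '[YOUR_VERAFINGER_DIRECTORY] = /path/to/verafinger' >> "${rcfile}"
```

The `grep` guard makes the snippet idempotent: running it twice will not duplicate the entry.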

You’re now all set. Proceed to the instructions to re-run algorithms and analysis routines available in this package.

Re-running Vulnerability Analysis

To re-run the vulnerability analysis, first run the baseline experiments based on Maximum Curvature and Miura Matching using the VERA FingerVein “Nom” protocol:

(hobpad2-veins) $ verify.py mc_bio
(hobpad2-veins) $ verify.py mc_pa

This should take about 2 hours to complete on a modest computer. If you have more cores available, it is advisable to run the jobs in parallel. To do so, execute it like this instead:

(hobpad2-veins) $ verify.py mc_bio parallel

Results are stored in the results folder, which is created if it doesn’t already exist. Then, run the analysis script to produce the vulnerability analysis merit figures for the algorithm/protocol above:

(hobpad2-veins) $ bob vuln hist results/scores/mc/{Nom,Nom-va}/nonorm/scores-dev

The last script should produce a figure like Figure 10 in the book chapter, together with error rates you can quote. Repeat the analysis above for the other protocols; for example, for the “Full” protocol:

(hobpad2-veins) $ verify.py --protocol='Full' mc_bio
(hobpad2-veins) $ verify.py --protocol='Full-va' mc_pa

There is a total of 4 different VERA FingerVein protocols supported: “Full”, “Nom” (the default), “Fifty” and “B”. There is a total of 3 different baseline algorithms, which vary only in the feature extraction (binarisation) technique used; all baselines use pre-annotated regions of interest for pre-processing and Miura Matching (correlation) for scoring:

- Maximum Curvature (configuration mc_bio)
- Repeated Line Tracking (configuration rlt_bio)
- Wide-Line Detector (configuration wld_bio)

You may refer to the documentation of bob.bio.vein for details on the implementation. You can list available baselines using the following command:

(hobpad2-veins) $ resources.py --types=config --packages=bob.hobpad2.veins

Once you have run all baselines, you should be able to reproduce Table 2 in the book chapter.
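Running every baseline against every protocol by hand is tedious, and the combinations can be enumerated in a small loop. This is a sketch only: the verify.py invocation and the --protocol option spelling are assumptions based on the examples earlier in this guide, and echo is prepended so you can inspect the generated commands before actually executing them.

```shell
# Dry-run: print one command per baseline/protocol combination.
# Remove "echo" once the printed commands look right for your setup.
for baseline in mc_bio rlt_bio wld_bio; do
  for protocol in Full Nom Fifty B; do
    echo verify.py --protocol="${protocol}" "${baseline}"
  done
done
```

With three baselines and four protocols, this prints twelve commands in total.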

For your reference, these are the EER values at the operating point:

(table not reproduced here)

This is the Impostor Attack Presentation Match Rate (IAPMR) at the EER threshold for the baseline systems above:

(table not reproduced here)

Re-running Competition Baselines

You can re-run the baseline algorithm for presentation-attack detection provided by Idiap for “The 1st Competition on Counter Measures to Finger Vein Spoofing Attacks”, presented at ICB 2018. To do so, run:

(hobpad2-veins) $ # runs the FFT-based algorithm for the "full" PAD protocol
(hobpad2-veins) $ spoof.py -vv verafinger-pad fourier
(hobpad2-veins) $ # the same as above, for the "cropped" protocol
(hobpad2-veins) $ spoof.py -vv --protocol=cropped verafinger-pad fourier

The above commands will write score files into the results directory. You may then run the analysis, which leads to the results reported in Figure 11 of the book chapter (see the last two plots of each PDF):

(hobpad2-veins) $ bob bio hist --output=idiap-full.pdf --eval results/scores/fourier/full/nonorm/scores-{dev,eval}
(hobpad2-veins) $ # The next line dumps a table with the calculated metrics
(hobpad2-veins) $ bob bio metrics --eval results/scores/fourier/full/nonorm/scores-{dev,eval}

Some of the results reported rely on scores provided by other teams that participated in the competition. Those scores are loaded from the scores folder shipped within this package. You can find the directory containing the scores with the following command:

(hobpad2-veins) $
Scores are located at /path/to/bob.hobpad2.veins/bob/hobpad2/veins/scores
(hobpad2-veins) $ ls -l /path/to/bob.hobpad2.veins/bob/hobpad2/veins/scores
total 0
drwxr-xr-x  6 foobar users 204 Apr 16 15:02 pad-competition/
drwxr-xr-x 10 foobar users 340 Apr 16 15:02 mc/
drwxr-xr-x 10 foobar users 340 Apr 16 15:02 rlt/
drwxr-xr-x 10 foobar users 340 Apr 16 15:02 wld/

Results from Table 3 in the book chapter can be reproduced with the following script:

mkdir -p competition-results

# Root of the bundled scores (see the listing above)
SCORES_DIR=/path/to/bob.hobpad2.veins/bob/hobpad2/veins/scores

for participant in idiap guc blab grip-priamus; do
  for protocol in full cropped; do
    plotfname=competition-results/${participant}-${protocol}.pdf
    logfname=competition-results/${participant}-${protocol}.txt
    echo "PAD of '${participant}' on protocol '${protocol}' -> ${logfname}"
    bob bio hist --output=${plotfname} --eval ${SCORES_DIR}/pad-competition/${participant}/${protocol}/scores-{dev,eval}
    bob bio metrics --output=${logfname} --eval ${SCORES_DIR}/pad-competition/${participant}/${protocol}/scores-{dev,eval}
  done
done
By looking at the log files generated by the metrics commands, you should be able to reconstruct the data in Table 1 relative to results published alongside the 1st Competition on Counter Measures to Finger Vein Spoofing Attacks. Results shown here correspond to the HTER on the test set for a threshold chosen a priori at the development set EER (in parentheses):

Protocol   Idiap      GUC       B-Lab     GRIP-PRIAMUS
Full       0% (0%)    4% (0%)   0% (0%)   0% (0%)
Cropped    19% (24%)  3% (0%)   1% (0%)   0% (0%)

To improve reproducibility, we include all pre-generated scores from running the systems described above in the scores folder within this package. The contents are the same as those of results/scores after you run all variants described above. You may reproduce all analyses described in the book chapter using only these scores, instead of running all baseline scripts. To do so, simply replace the path in the relevant analysis command lines above from results/scores to /path/to/scores, which may be obtained with the script described above.
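For instance, the vulnerability-analysis command from earlier could be pointed at the bundled scores like this. This is a dry-run sketch: /path/to/scores stands for the directory reported by the score-locating command above, and echo is prepended so nothing is actually executed.

```shell
# Same analysis as before, but reading the bundled scores instead of
# results/scores. Replace /path/to/scores with your actual location
# and drop the leading "echo" to execute.
SCORES_DIR=/path/to/scores
echo bob vuln hist "${SCORES_DIR}/mc/Nom/nonorm/scores-dev" \
                   "${SCORES_DIR}/mc/Nom-va/nonorm/scores-dev"
```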

The scores sub-directory also contains de-anonymized scores provided by the competitors in our paper “The 1st Competition on Counter Measures to Finger Vein Spoofing Attacks”, presented at ICB 2018. You’ll need these to reproduce some of the figures shown in the book chapter. Notice that we have not received source-code contributions from competitors and therefore cannot reproduce, or provide code to reproduce, those results.

Extending Functionality

To extend the functionality provided by this package, you’ll need to develop new components for either bob.bio.vein (our biometric vein recognition framework) or bob.pad.vein (our presentation-attack detection framework for vein recognition systems). Start by reading their user guides.


For questions or to report issues with this software package, contact our development mailing list.
