Preprint Article Version 1 Preserved in Portico This version is not peer-reviewed

Advancements in Synthetic Generation for Contactless Palmprint Biometrics Using StyleGAN2-ADA and StyleGAN3

Version 1 : Received: 20 December 2023 / Approved: 21 December 2023 / Online: 21 December 2023 (14:04:58 CET)

How to cite: Chowdhury, A. M. M.; Khondkar, M. J. A.; Imtiaz, M. H. Advancements in Synthetic Generation for Contactless Palmprint Biometrics Using StyleGAN2-ADA and StyleGAN3. Preprints 2023, 2023121655. https://doi.org/10.20944/preprints202312.1655.v1

Abstract

In addressing the challenges of data scarcity in biometrics, this study explores the generation of synthetic palmprint images as an efficient, cost-effective, and privacy-preserving alternative to reliance on real-world data. Traditional methods for synthetic biometric image creation primarily involve orientation modifications and filter applications, with no established method specific to palmprints. We introduce the use of the style-based generator StyleGAN2-ADA, from the StyleGAN series, renowned for generating high-quality images. Furthermore, we explore the capabilities of its successor, StyleGAN3, which offers enhanced image generation and facilitates smooth, realistic transitions. By comparing the performance of the two models on a public dataset, we aim to establish the most efficient generative model for this purpose. Evaluations were conducted by incorporating the SIFT (Scale-Invariant Feature Transform) algorithm into our evaluation framework. Preliminary findings suggest that StyleGAN3 offers superior generative capabilities, enhancing equivariance in synthetic palmprint generation.

Keywords

Palmprint synthesis; StyleGAN2-ADA; StyleGAN3; SIFT

Subject

Engineering, Electrical and Electronic Engineering
