This week’s BEACON Researchers at Work post is by North Carolina A&T State University graduate student Aniesha Alford.
We all remember the events of September 11, 2001. I was a sophomore in high school, and I can still vividly recall watching the news with my classmates as the events unfolded. Since that day, measures have continually been taken to avoid similar tragedies. One preventive measure is a more secure identification process, specifically via the use of biometric recognition. Biometric recognition systems are currently used by a number of commercial and government organizations; however, there is always room for improvement. I perform research with the Center for Advanced Studies in Identity Sciences (CASIS), and my research is in a new field of study that we call Genetic and Evolutionary Biometrics (GEB). GEB is devoted to the discovery, design, and analysis of evolution-based methods for solving traditional problems within the field of biometrics. My research, in particular, focuses on the development of GEB applications to improve the performance of facial and periocular (i.e., the area around the eyes) biometric recognition.
Biometric recognition involves the use of distinct physical, chemical, and/or behavioral characteristics for the automatic recognition of an individual. Examples of such biometric characteristics (also known as modalities) include the face, fingerprint, voice, and signature. These recognition systems typically work by first using a sensor (such as a camera) to acquire a biometric sample. The newly acquired sample is then passed to a feature extractor, which transforms it into a set of unique features referred to as a feature template. Often, feature selection techniques are then applied to reduce the dimensionality of the resulting feature templates. Next, the reduced template is compared to those previously enrolled (stored) in a database. The similarity between the recently acquired and enrolled templates is then measured and used to make a decision (accept/reject an individual).
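The comparison step above can be sketched in a few lines of Python. This is only an illustrative toy, not the system used in our research: the function name `match`, the cosine-similarity measure, and the threshold value are all my own choices here, standing in for whatever matcher and decision rule a real system would use.

```python
import numpy as np

def match(probe, gallery, threshold=0.8):
    """Compare a probe feature template against enrolled templates.

    probe: 1-D feature template produced by the feature extractor
    gallery: dict mapping enrolled identity -> stored feature template
    Returns the best-matching identity, or None if the probe is rejected.
    """
    best_id, best_score = None, -1.0
    for identity, enrolled in gallery.items():
        # Cosine similarity is one common way to score template similarity
        score = np.dot(probe, enrolled) / (
            np.linalg.norm(probe) * np.linalg.norm(enrolled))
        if score > best_score:
            best_id, best_score = identity, score
    # Accept only if the best similarity clears the decision threshold
    return best_id if best_score >= threshold else None
```

In a deployed system the templates would be high-dimensional vectors from a feature extractor such as Eigenfaces or LBP, and the threshold would be tuned to trade off false accepts against false rejects.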
In the biometrics community, feature selection techniques have typically focused on retaining the most salient individual features (i.e., the most variant individual dimensions, the most consistent individual features, or the most discriminative individual features). However, my research proposes the use of genetic and evolutionary computations (GECs) to: (a) evolve subsets of the most salient combinations of features and/or (b) weight features based on their discriminatory ability, in an effort to increase accuracy while decreasing the overall number of features needed for recognition.
Three techniques have been developed and applied for facial and periocular recognition: Genetic & Evolutionary Feature Selection (GEFeS), Genetic & Evolutionary Feature Weighting (GEFeW), and Genetic & Evolutionary Feature Weighting/Selection (GEFeWS). GEFeS reduces the number of features used by evolving a feature mask (FM) that discards features that do not aid in increasing the recognition accuracy. In contrast, GEFeW evolves a weight for each feature within a feature template based on its relevance. Our final technique, GEFeWS, is a hybrid of GEFeS and GEFeW: it evolves a FM that discards those features that are not relevant and weights those features that are.
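To give a feel for how a GEC can evolve such a feature mask, here is a minimal genetic-algorithm sketch in the GEFeWS spirit: each individual is a vector of per-feature weights, and weights below a threshold are zeroed out, discarding those features. This is my own simplified illustration, not our actual implementation; the function names, the truncation selection, one-point crossover, and the 0.5 masking threshold are all assumptions made for brevity. In practice, fitness would be the recognition accuracy achieved on a validation set when the masked, weighted template is used for matching.

```python
import random

def evolve_feature_mask(fitness, n_features, pop_size=20, generations=50,
                        mask_threshold=0.5):
    """Toy GA: evolve per-feature weights in [0, 1); weights below
    mask_threshold are zeroed, discarding those features (GEFeWS-style).

    fitness: callable scoring a masked weight vector (higher is better),
             e.g. recognition accuracy on a validation set.
    """
    def masked(ind):
        return [w if w >= mask_threshold else 0.0 for w in ind]

    # Random initial population of weight vectors
    pop = [[random.random() for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: fitness(masked(ind)),
                        reverse=True)
        parents = scored[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_features)       # point mutation
            child[i] = random.random()
            children.append(child)
        pop = parents + children
    best = max(pop, key=lambda ind: fitness(masked(ind)))
    return masked(best)
```

Running this with a fitness function that rewards accuracy tends to drive many weights below the threshold, so the evolved mask both selects and weights features at once, which is exactly the appeal of the hybrid approach.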
To test the effectiveness of these techniques, images were selected from the Face Recognition Grand Challenge (FRGC) database. Two feature extraction techniques were then applied to the facial and periocular images: the Eigenface method and the Local Binary Patterns (LBP) method. The Eigenface method, which is based on Principal Component Analysis, is a statistical dimensionality reduction technique that extracts only those dimensions that are necessary to efficiently distinguish images of individuals. The LBP method is a texture analysis technique that works by first segmenting an image into a grid of evenly sized regions (referred to as patches) and then analyzing the intensity changes of the pixels within each patch. GEFeS, GEFeW, and GEFeWS were then used to evolve FMs for the face-only, periocular-only, and face + periocular feature templates. The performances of these techniques were compared to the performance of the feature templates without the use of GECs.
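The LBP idea is simple enough to sketch directly. In the basic 3x3 variant, each pixel is replaced by an 8-bit code recording which of its neighbors are at least as bright as it, and a histogram of those codes within each patch becomes part of the feature template. The sketch below assumes the basic formulation (our work may use a different LBP variant); the function names and the normalization choice are mine.

```python
import numpy as np

def lbp_image(img):
    """Basic 3x3 Local Binary Patterns: encode each interior pixel as an
    8-bit code, one bit per neighbor that is >= the center pixel."""
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbor offsets starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = img[y, x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy, x + dx] >= center:
                    code |= 1 << bit
            codes[y - 1, x - 1] = code
    return codes

def lbp_histogram(patch, bins=256):
    """Normalized histogram of LBP codes within one patch; concatenating
    the per-patch histograms yields the feature template."""
    hist, _ = np.histogram(lbp_image(patch), bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)
```

Because every bin of every per-patch histogram becomes one entry of the template, these templates grow large quickly, which is precisely why evolved feature masks that discard uninformative entries pay off.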
Our results showed that by fusing the periocular biometric with the face, we could achieve higher recognition accuracies than using the two biometric modalities independently. In addition, the LBP feature templates outperformed the Eigenface templates. Our results also showed that our GECs were able to achieve higher recognition rates than the baseline methods (i.e. the feature templates without the use of GECs), while using significantly fewer features. Of the three techniques, GEFeWS performed best, using less than 50% of the extracted features to achieve higher accuracies than GEFeS and GEFeW alone.
In conclusion, I am very excited about our research. It is great to be one of the pioneers in this new field of study, but it is even greater to think that one day our research could be implemented to make biometric security processes more accurate, faster, and more efficient. In addition, the potential that similar techniques may have in other areas of study is astounding. Through presenting my research at conferences, I have been approached by several individuals interested in applying similar techniques to their own work (e.g., tomato classification). I look forward to seeing how the skills I have gained through this research will come into play in the future. The possibilities seem endless, and I believe that BEACON has prepared me for the challenge!
For more information about Aniesha’s research, you can contact her at aalford at ncat dot edu.