US20100329568A1 - Networked Face Recognition System - Google Patents


Info

Publication number
US20100329568A1
Authority
US
United States
Prior art keywords
face
image
network
verifier
compliance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/919,092
Inventor
Avihu Meir Gamliel
Shmuel Goldenberg
Felix Tsipis
Yuri Kheifetz
Ester Freitsis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
C-TRUE Ltd
C True Ltd
Original Assignee
C True Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by C True Ltd filed Critical C True Ltd
Priority to US12/919,092 priority Critical patent/US20100329568A1/en
Assigned to C-TRUE LTD. reassignment C-TRUE LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KHEIFETZ, YURI, TSIPIS, FELIX, GOLDENBERG, SHMUEL, FREITSIS, ESTER, GAMLIEL, AVIHU MEIR
Publication of US20100329568A1 publication Critical patent/US20100329568A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/164 Detection; Localisation; Normalisation using holistic features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/94 Hardware or software architectures specially adapted for image or video understanding
    • G06V10/95 Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures

Definitions

  • the present invention relates to face authentication and recognition and, more particularly, but not exclusively to a networked system for automatic and remote face authentication and recognition.
  • Identity theft is a criminal fraud that involves someone pretending to be someone else in order to steal money or get other benefits.
  • a person whose identity is used can suffer various consequences when he or she is held responsible for the perpetrator's actions.
  • Identity theft includes, but is not limited to business/commercial identity theft (using another's business name to obtain credit), criminal identity theft (posing as another when apprehended for a crime), financial identity theft (using another's identity to obtain goods and services), identity cloning (using another's information to assume his or her identity in daily life), and medical identity theft (using another's information to obtain medical care, drugs, or access to sensitive medical records).
  • U.S. Pat. No. 7,050,608, to Dobashi, filed on Mar. 7, 2002, entitled “Face image recognition apparatus”, discloses a face image recognition apparatus.
  • Dobashi's face image recognition apparatus includes a registration information holding section in which a reference feature amount of the face of at least one to-be-recognized person is previously registered.
  • the feature amount of the face is extracted from a face image input via an image input section, by use of a feature amount extracting section.
  • a recognition section determines the recognition rate between the extracted feature amount and the reference feature amount registered in the registration information holding section.
  • a feature amount adding section additionally registers the feature amount extracted by the feature amount extracting section as a new reference feature amount into the registration information holding section when it is determined that the determined recognition rate is lower than a preset value.
  • U.S. Pat. No. 7,221,809, to Geng, filed on Dec. 17, 2002, entitled “Face recognition system and method”, discloses a method of automatically recognizing a human face.
  • the method described by Geng includes developing a three-dimensional model of a face, and generating a number of two-dimensional images based on the three-dimensional model.
  • the generated two-dimensional images are then enrolled in a database and searched against an input image to identify the face in the input image.
  • Security screening involves capturing images of people in public places and comparing them to images of persons who are known to pose security risks.
  • One prime example of security screening is its use at airport security checkpoints.
  • the system described by Turk includes an imaging system which generates an image of the audience and a selector module for selecting a portion of the generated image. Turk's system further includes a detection means which analyzes the selected image portion to determine whether an image of a person is present, and a recognition module responsive to the detection means for determining whether a detected image of a person identified by the detection means resembles one of a reference set of images of individuals.
  • Chen's system has a first data storing module for storing three dimensional (3D) face model data and two dimensional (2D) face image data, an input unit for inputting 3D face model data and 2D face image data, a signal conversion module for converting analog data of the 3D face model data and 2D face image data to digital data, and a second data storing module for storing the digital data.
  • Chen's system further includes a micro-processing module for analyzing geometric characteristics of points in the 3D face model data stored in the first and second data storing module to determine feature points of the 3D face model data, and assigning different weight ratios to feature points.
  • Chen's system further includes a comparison module for comparing the feature points stored in the first and second data storing module, during which different geometric characteristics being given different weight ratios, and calculating relativity between the feature points to obtain a comparison result.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to forward the image for feature extraction; a feature extractor, associated with the face verifier, configured to extract a feature from the forwarded image; and a face identifier, communicating with the feature extractor over a network, configured to receive the extracted feature and identify the face in the forwarded image, using the extracted feature.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send the image over a network; a feature extractor, communicating with the face verifier over the network, and configured to receive the sent image and extract a feature from the received image; and a face identifier, associated with the feature extractor and configured to identify the face in the received image, using the extracted feature.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send the image over a first network; a feature extractor, communicating with the face verifier over the first network, configured to receive the sent image, extract a feature from the received image, and send the extracted feature over a second network; and a face identifier, communicating with the feature extractor over the second network, configured to receive the extracted feature and identify the face in the received image, using the extracted feature.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send data comprising at least a part of the image over a network; and a face identifier, communicating with the face verifier over the network, configured to receive the sent data and identify the face, using at least a part of the received data.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send data comprising at least a part of the image over a network; and a face database updater, communicating with the face verifier over the network, and configured to receive the sent data and update a face database with at least a part of the received data.
  • a networked system for face recognition comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and restrict forwarding of data comprising at least a part of the image over a network, according to results of the verification of the compliance.
  • a method for face recognition comprising: a) verifying compliance of a face in an image with a predefined criterion; b) upon the verifying of the compliance being successful, sending data comprising at least a part of the image over a network, for identification; and c) identifying the face in the image, using at least a part of the sent data.
  • a method for face recognition comprising: a) verifying compliance of a face in an image with a predefined criterion; b) upon the verifying of the compliance being successful, sending data comprising at least a part of the image over a network; and c) updating a database of images with at least a part of the sent data.
  • a method for face recognition comprising: a) verifying compliance of a face in an image with a predefined criterion; and b) controlling forwarding of data comprising at least a part of the image through a network, according to a result of the verifying of the compliance.
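The verify-then-forward flow common to the method claims above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; all function names are hypothetical, and the network, criterion, and identification steps are stand-in callables:

```python
# Sketch of the claimed flow: an image is forwarded for identification
# only when the face in it complies with a predefined criterion.
# All names here are illustrative, not from the patent.

def verify_compliance(image, criterion):
    """Step (a): return True when the face in `image` satisfies `criterion`."""
    return criterion(image)

def recognize(image, criterion, send_over_network, identify):
    if not verify_compliance(image, criterion):
        return None  # forwarding is restricted when verification fails
    # Step (b): send (at least part of) the image over the network.
    payload = send_over_network(image)
    # Step (c): identify the face using the sent data.
    return identify(payload)
```

The gating step is the point of the claims: non-compliant images never reach the network, saving bandwidth and avoiding low-quality identification attempts.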
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
  • selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof.
  • selected steps of the invention could be implemented as a chip or a circuit.
  • selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
  • selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • FIG. 1 is a block diagram illustrating a first networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a second networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a third networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a fourth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a fifth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating a sixth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 7 is a block diagram illustrating a seventh networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a first method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a second method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a third method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a fourth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 12 is a flowchart illustrating a fifth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 13 is a flowchart illustrating a sixth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating a seventh method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 15 is a flowchart illustrating an eighth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 16 illustrates cropping of an image of a face, according to an exemplary embodiment of the present invention.
  • FIGS. 17 a , 17 b , and 17 c illustrate a face recognition scenario, according to an exemplary embodiment of the present invention.
  • the present embodiments comprise a networked system and method for recognizing a face in one or more images (say a still image, a sequence of video images, etc.).
  • the system may be implemented on a wide area network such as the World Wide Web (i.e. the internet), on an intranet network, etc., as described in further detail herein below.
  • a database of faces of known individuals is used to store images of the faces of the known individuals, or features extracted from the images (say a biometric stamp), as described in further detail hereinbelow.
  • a face of a user of a computer station remote from the database of faces is captured in an image (say by a still video camera).
  • a client module installed on the remote station selectively sends the image (or data which includes one or more features extracted from the image) over the internet, for storage on the remote face database.
  • the face as captured in the image has to comply with a criterion defined in advance, before the captured image is forwarded for storage on the database, as described in further detail hereinbelow.
  • the criterion pertains to a statistical model run over previously received images.
  • the criterion may be based on degree of deviation of the captured image (and thus the face in the image) from an average image, as known in the art.
  • the average image is calculated from the previously received images, using methods known in the art.
  • each pixel's intensity equals an average of intensities of pixels in the same position in the previously received images.
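The average-image criterion described above can be sketched in a few lines of NumPy. The pixel-wise averaging follows the description; the mean-absolute-deviation measure and the threshold value are illustrative assumptions, not specified by the patent:

```python
import numpy as np

def average_image(previous_images):
    # Each pixel's intensity is the average of the intensities of the
    # pixels at the same position in the previously received images.
    return np.mean(np.stack(previous_images), axis=0)

def complies(image, previous_images, max_deviation=30.0):
    # Accept the new image when its deviation from the average image
    # falls below an assumed threshold (mean absolute difference here).
    avg = average_image(previous_images)
    deviation = np.mean(np.abs(image - avg))
    return deviation <= max_deviation
```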
  • the criterion is based on a comparison made between the image and one or more images previously captured from the same user. That is to say that the face of the user in the captured image is compared with the face of the same user, as captured in previously received image(s), or with an average image calculated from previously received images of the same user, which thus bears an average of the face, as described in further detail hereinabove.
  • the criterion is a symmetry criterion.
  • the face as captured in the image may have to be successfully verified as symmetric before the image (or data) is sent, as described in further detail hereinbelow.
  • the symmetry criterion is based on symmetry of a polygon, which connects selected parts of the captured image.
  • the selected parts are known elements of a human face (say nose, eyes, or mouth).
  • the known elements may be identified in the captured image using techniques known in the art, such as Viola-Jones algorithms, Neural Network methods, etc.
  • the centers of the known face elements identified in the captured image are connected to form a polygon, and a verified symmetry of the polygon serves as an indication for the symmetry of the face in the captured image.
  • the centers of the right eye, left eye, and nose, in the captured image may be connected to form a triangle, which is expected to be isosceles, and thus symmetric.
  • a successful verification of the triangle as isosceles indicates that the face captured in the image is indeed symmetric.
  • the centers of the eyes and edges of lips in the captured image may be connected to form a trapezoid, which is expected to be symmetric, etc.
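The isosceles-triangle check described above (eyes and nose forming a triangle whose two eye-to-nose sides should be equal) can be sketched as follows. The landmark coordinates are assumed to come from a prior detection step (say Viola-Jones); the tolerance value is an assumption, not from the patent:

```python
import math

def is_roughly_isosceles(right_eye, left_eye, nose, tolerance=0.1):
    # The nose should be (roughly) equidistant from both eye centers
    # when the face is symmetric and facing the camera.
    d_right = math.dist(nose, right_eye)  # nose-to-right-eye side
    d_left = math.dist(nose, left_eye)    # nose-to-left-eye side
    # Relative difference between the two sides of the triangle.
    return abs(d_right - d_left) / max(d_right, d_left) <= tolerance
```

The same pattern extends to the trapezoid variant: connect the eye centers and lip edges, then compare the lengths of the two lateral sides.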
  • the selected parts are segments of the face in the image.
  • the segments are identified in the captured image, using image segmentation methods known in the art, such as Feature Oriented Flood Fill, Texture Analysis, Principal Component Analysis (PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e. harmonic methods), etc.
  • the mass centers of the selected segments (say segments positioned in parts of the image expected to include known parts of the face, say nose, lips, or mouth) in the captured image are connected to form a polygon.
  • a verified symmetry of the polygon serves as an indication for the symmetry of the face, as described in further detail hereinabove.
  • the symmetry criterion is applied on a map representation of the image.
  • the map representation may include, but is not limited to: an intensity map, a phase map, a texture map (i.e. gradient map), or any other map generated from the image using standard image processing filters, as known in the art.
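Two of the map representations listed above can be sketched with plain NumPy: a texture (gradient-magnitude) map and an FFT phase map, both derived from an intensity image. The specific filters are illustrative choices; the patent does not prescribe an implementation:

```python
import numpy as np

def texture_map(intensity):
    # Gradient map: per-axis finite differences, combined into a
    # gradient-magnitude image that highlights texture and edges.
    gy, gx = np.gradient(intensity.astype(float))
    return np.hypot(gx, gy)

def phase_map(intensity):
    # Phase map: the phase angle of the 2-D Fourier transform.
    return np.angle(np.fft.fft2(intensity))
```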
  • the symmetry criterion may be defined before the images are stored in the face database, as described in further detail hereinbelow.
  • the symmetry criterion is formulated as a threshold value for symmetry, as known in the art.
  • the threshold value may be a theoretical value based on theoretical calculations, an empirical value derived from experimental data, etc., as known in the art.
  • a face in a new image (say a face of an individual who uses the computer station and wishes to be granted access to a classified information system) is captured.
  • the face in the new image is tested with respect to the criterion, say the symmetry of the face, as described in further detail hereinabove. That is to say that the face has to comply with the criterion before an attempt is made at identifying the face, say by attempting to match the captured image against images in the remote database of faces.
  • a predefined criterion is enforced on all faces identified in images, using the methods and systems taught hereinbelow.
  • the predefined criterion may improve accuracy and efficiency of identification of the face in the image.
  • an individual may be asked to have his face aligned into a position where the face appears symmetric (say a position where the individual looks straight into a camera), as described in further detail hereinbelow.
  • the uniform face alignment may ease identification of a face in a new image, through comparison with images in the face database.
  • the identification may be eased, since the uniform face alignment may increase similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • FAR: False Acceptance Rate
  • FRR: False Rejection Rate
  • FIG. 1 is a block diagram illustrating a first networked system for face recognition, according to an exemplary embodiment of the present invention.
  • the first networked system for face recognition includes a face verifier 110 .
  • the face verifier 110 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine) or on an end station of a Passenger Authentication System, as described in further detail hereinbelow.
  • the face verifier 110 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over previously received images, a comparison made between the image and one or more images previously captured from the same user, symmetry, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the first system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 110 , as known in the art.
  • the face verifier 110 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 110 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 110 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face is a face of an individual who is a collaborating user.
  • the face may belong to a user who may be asked to move into a better position.
  • the user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera).
  • a new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 110 , say from a surveillance system (such as a video camera which continuously captures images of a secure area), as described in further detail hereinbelow.
  • the face verifier 110 verifies the symmetry of face in one or more of the images.
  • the first networked system for face recognition further includes a feature extractor 112 , in communication with the face verifier 110 .
  • upon successfully verifying that the face complies with the predefined criterion (say the symmetry criterion), the face verifier 110 forwards the image to the feature extractor 112.
  • the face verifier 110 measures compliance of the face with the predefined criterion in each image of a video sequence fed to the face verifier 110 . Then, the face verifier 110 selects the one or more image(s) of the face amongst the input images, such that the measured compliance of the selected images of the face is highest amongst the input images. The selected images are forwarded to the feature extractor 112 .
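The frame-selection behaviour just described can be sketched as follows: compliance is measured for every frame of the sequence, and the highest-scoring frames are forwarded. The function names and the idea of passing the compliance measure as a callable are illustrative assumptions:

```python
def select_best_frames(frames, score, count=1):
    # Rank frames by measured compliance (say a symmetry score),
    # highest first, and keep the top `count` for feature extraction.
    ranked = sorted(frames, key=score, reverse=True)
    return ranked[:count]
```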
  • the feature extractor 112 extracts one or more features from the image.
  • the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • the extracted features may be biometric stamps, as known in the art.
  • the feature extractor 112 may use PCA (Principal Component Analysis) Projections, in order to generate a vector which is used as a biometric stamp of the image (i.e. a feature of the image), as known in the art.
  • the feature extractor 112 may use one or more feature extraction methods currently known in the art.
  • the feature extraction methods used by the feature extractor 112 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminant Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
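Of the methods listed, the PCA projection mentioned earlier is the simplest to sketch: flattened face images are projected onto their leading principal components, and the coefficient vector serves as a compact biometric stamp. This is a generic PCA sketch, not the patent's implementation; dimensions and the component count are assumptions:

```python
import numpy as np

def fit_pca(training_images, n_components):
    # Stack flattened training faces into a data matrix and center it.
    X = np.stack([img.ravel() for img in training_images]).astype(float)
    mean = X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal axes.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def biometric_stamp(image, mean, components):
    # Projection coefficients of the centered image on the components:
    # a short vector usable as the image's "stamp".
    return components @ (image.ravel().astype(float) - mean)
```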
  • the first networked system for face recognition further includes a face identifier 120 .
  • the face identifier 120 communicates with the feature extractor 112 over a computer network 115 .
  • the network 115 is a wide area network (say the internet) or an intranet network.
  • An intranet network is an organization's internal or restricted access network that is similar in functionality to the internet, but is only available to the organization internally.
  • the face identifier 120 identifies the face in the image, by matching the feature(s) extracted from the image (say a biometric stamp) with one or more features (say biometric stamps) stored in the database, in advance, as described in further detail hereinbelow.
  • the face identifier 120 may use a database of features previously extracted from face images of known individuals, say known criminals.
  • each feature (or a group of features) is stored associated with data identifying a specific one of the individuals.
  • Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
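The identification step described in the bullets above, matching an extracted stamp against stamps stored with identifying data, can be sketched as a nearest-neighbour search. The Euclidean distance measure and the distance threshold are illustrative assumptions; the patent does not specify the matching rule:

```python
import numpy as np

def identify(probe_stamp, database, max_distance=10.0):
    """database: list of (stamp, identity) pairs; returns the identity
    of the closest stored stamp, or None when no stamp is close enough."""
    best_identity, best_distance = None, float("inf")
    for stamp, identity in database:
        distance = np.linalg.norm(probe_stamp - stamp)  # Euclidean distance
        if distance < best_distance:
            best_identity, best_distance = identity, distance
    return best_identity if best_distance <= max_distance else None
```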
  • FIG. 2 is a block diagram illustrating a second networked system for face recognition, according to an exemplary embodiment of the present invention.
  • the second networked system for face recognition includes a face verifier 210 .
  • the face verifier 210 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), as described in further detail hereinbelow.
  • the face verifier 210 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the second system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 210 , as known in the art.
  • the face verifier 210 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • the symmetry criterion may be defined by a user of the second system, as described in further detail hereinbelow.
  • the face verifier 210 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 210 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 210 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face is a face of an individual who is a collaborating user, as described in further detail hereinabove.
  • the face may belong to a user who may be asked to move into a better position.
  • the user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera).
  • a new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 210 , say from a surveillance system, as described in further detail hereinbelow.
  • the face verifier 210 verifies the symmetry of the face in the images of the video sequence.
  • the second networked system for face recognition further includes a feature extractor 218 .
  • the face verifier 210 communicates with the feature extractor 218 over a computer network 215 .
  • the network 215 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • upon successfully verifying the symmetry of the face in the image, the face verifier 210 sends the image to the feature extractor 218 , over the network 215 .
  • the face verifier 210 measures symmetry of each image of a video sequence fed to the face verifier 210 . Then, the face verifier 210 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. The face verifier 210 sends the selected images to the feature extractor 218 , over the network 215 .
  • the feature extractor 218 extracts one or more features from the image.
  • the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • the feature extractor 218 may use one or more feature extraction methods currently known in the art.
  • the feature extraction methods used by the feature extractor 218 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminant Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
  • the second networked system for face recognition further includes a face identifier 220 , in communication with the feature extractor 218 .
  • the face identifier 220 identifies the face in the image, by matching the features extracted from the image with one or more features stored in a database, in advance.
  • the face identifier 220 may use a database of features previously extracted from face images of known individuals, say known criminals.
  • each feature (or a group of features) is stored associated with data identifying a specific one of the individuals.
  • Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
  • the face identifier 220 matches the feature(s) extracted from the image with feature(s) already stored in the database, and identifies the face as belonging to the individual whose name, address and phone numbers are associated with the matched feature(s).
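The matching step described above can be sketched as a nearest-neighbour search over the stored feature vectors. The function name, the Euclidean distance metric, and the distance threshold are illustrative assumptions; the patent does not specify a particular matching rule.

```python
import numpy as np

def identify_face(query, stored_features, identities, max_distance=10.0):
    """Return the identity record (say name, address, phone numbers)
    associated with the nearest stored feature vector, or None when
    no stored feature is close enough to count as a match."""
    stored = np.asarray(stored_features, dtype=float)
    dists = np.linalg.norm(stored - np.asarray(query, dtype=float), axis=1)
    best = int(np.argmin(dists))
    return identities[best] if dists[best] <= max_distance else None
```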
  • FIG. 3 is a block diagram illustrating a third networked system for face recognition, according to an exemplary embodiment of the present invention.
  • the third networked system for face recognition includes a face verifier 310 .
  • the face verifier 310 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), as described in further detail hereinbelow.
  • the face verifier 310 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, say by calculating an average image, as described in further detail hereinabove.
  • the predefined criterion may be based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the third system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 310 , as known in the art.
  • the face verifier 310 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • the symmetry criterion may be defined by a user of the third system, as described in further detail hereinbelow.
  • the face verifier 310 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 310 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 310 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
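The intensity-map variant of the symmetry check can be sketched as a mirror-correlation of the two halves of the face image. This is a hypothetical sketch: the function name, the normalized-correlation score, and the 0.8 threshold are assumptions, not the patent's method; the gradient-map and FFT-phase-map variants would apply the same comparison to those maps instead of raw intensities.

```python
import numpy as np

def intensity_symmetry(face, threshold=0.8):
    """Score left/right symmetry of a grayscale face image by
    normalized correlation between the left half of the intensity
    map and the mirrored right half; verify against a threshold."""
    face = np.asarray(face, dtype=float)
    h, w = face.shape
    half = w // 2
    left = face[:, :half]
    right = np.fliplr(face[:, w - half:])   # mirror the right half
    l = left - left.mean()
    r = right - right.mean()
    denom = np.sqrt((l * l).sum() * (r * r).sum())
    # Score in [-1, 1]; 1.0 means a perfect mirror image.
    score = float((l * r).sum() / denom) if denom else 1.0
    return score, score >= threshold
```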
  • the face is a face of an individual who is a collaborating user, as described in further detail hereinabove.
  • the face may belong to a user who may be asked to move into a better position.
  • the user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera).
  • a new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 310 , say from a surveillance system, as described in further detail hereinbelow.
  • the face verifier 310 verifies the symmetry of the face in one or more of the images.
  • the third networked system for face recognition further includes a feature extractor 317 .
  • the face verifier 310 communicates with the feature extractor 317 over a first computer network 315 .
  • the first network 315 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • upon successfully verifying the symmetry of the face in the image, the face verifier 310 sends the image to the feature extractor 317 , over the first network 315 .
  • the face verifier 310 measures symmetry of each image of a video sequence fed to the face verifier 310 . Then, the face verifier 310 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. The face verifier 310 sends the selected images to the feature extractor 317 , over the first network 315 .
  • the feature extractor 317 extracts one or more features from the image.
  • the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • the feature extractor 317 may use one or more feature extraction methods currently known in the art.
  • the feature extraction methods used by the feature extractor 317 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminating Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
  • the third networked system for face recognition further includes a face identifier 320 .
  • the face identifier 320 communicates with the feature extractor 317 over a second computer network 319 .
  • the second network 319 may be the same network as the first network 315 (that is to say that the face verifier 310 , the feature extractor 317 , and the face identifier 320 , are all connected by the same network, say the internet).
  • the second network 319 is another network, be it an intranet network, the internet, or another wide area network, as described in further detail hereinabove.
  • the face identifier 320 identifies the face in the image, by matching the features extracted from the image with one or more features stored in a database, in advance.
  • the face identifier 320 may use a database of features previously extracted from face images of known individuals, say known criminals.
  • each feature (or a group of features) is stored associated with data identifying a specific one of the individuals.
  • Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
  • FIG. 4 is a block diagram illustrating a fourth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • the fourth networked system for face recognition includes a face verifier 410 .
  • the face verifier 410 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), an end station at an entrance of a secure area, etc.
  • the face verifier 410 is implemented on an end station of a Passenger Authentication System.
  • the station is deployed by the entrance of a plane and used for ensuring that only a person granted a boarding pass boards the plane (and not an impostor), etc.
  • the face verifier 410 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the fourth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 410 , as known in the art.
  • the face verifier 410 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • the symmetry criterion may be defined by a user of the fourth system, as described in further detail hereinbelow.
  • the face verifier 410 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 410 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • the face verifier 410 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail.
  • the face is a face of an individual who is a collaborating user.
  • the face may belong to a user who may be asked to move into a better position.
  • the user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera).
  • a new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 410 , say from a surveillance system (such as a video camera which continuously captures images of a secure area), as described in further detail hereinbelow.
  • the face verifier 410 verifies the symmetry of the face in each of the images.
  • when the face verifier 410 successfully verifies the symmetry of the face in one of the images, the face verifier 410 sends data to a remote face identifier 420 , as described in further detail hereinbelow.
  • the sent data includes the whole image.
  • the sent data includes only a part of the image.
  • the face verifier 410 may extract one or more features from the image, using feature extraction methods known in the art, as described in further detail hereinabove.
  • the data which includes the features extracted from the image is sent to a remote face identifier 420 , as described in further detail hereinabove.
  • the face verifier 410 measures symmetry of each one of two or more images of the video sequence fed to the face verifier 410 .
  • the face verifier 410 selects one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. Consequently, data which includes at least a part of each of the selected images is sent to a remote face identifier 420 , over a network, as described in further detail hereinbelow.
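The frame-selection step described above can be sketched as a ranking over symmetry scores. The function name and the top-k interface are illustrative assumptions; the symmetry scoring function itself would be whichever criterion the verifier uses.

```python
def select_most_symmetric(frames, symmetry_score, k=1):
    """Rank video frames by measured symmetry and keep only the
    top-k frames for forwarding to the remote face identifier."""
    # sorted() is stable and compares only the computed scores.
    ranked = sorted(frames, key=symmetry_score, reverse=True)
    return ranked[:k]
```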
  • the fourth system further includes a remote face identifier 420 .
  • the face verifier 410 communicates with the face identifier 420 over a computer network 415 .
  • the network 415 is a wide area network (say the internet) or an intranet network.
  • the face identifier 420 identifies the face.
  • the face identifier 420 may use any of currently used face identification methods, for identifying the face, as described in further detail hereinabove.
  • the face identifier 420 may receive data, which includes the whole image (or a part of the image), from the face verifier 410 .
  • the face identifier 420 may extract one or more features from the image (or from the part of the image).
  • the face identifier 420 identifies the face in the image sent by the face verifier 410 , by matching the extracted features with feature data stored in a face database 450 , in advance of the matching.
  • the feature data is stored in the face database 450 , together with personal data identifying individuals.
  • the face identifier 420 identifies the face in the image sent by the face verifier 410 , as belonging to an individual having the personal data associated with the matched feature data.
  • the fourth system further includes an image capturer, connected to the face verifier 410 .
  • the image capturer may include, but is not limited to a digital still camera, a video camera, a web camera, etc.
  • the image capturer captures the image(s) of the face, and forwards the captured image(s) to the face verifier 410 .
  • when the face verifier 410 finds the face in the image non-symmetric (say when the face fails to meet the symmetry criterion), the face verifier 410 instructs the image capturer (say the digital still camera) to capture a new image of the face.
  • upon finding the face non-symmetric, the face verifier 410 presents an appropriate message (say a message asking the individual whose face image is captured to look straight into the image capturer, etc.), and the image capturer captures a new image of the face, as described in further detail hereinbelow.
  • the fourth system further includes a face detector, in communication with the face verifier 410 .
  • the face detector detects the face in the image.
  • the face detector may use one or more methods known in the art for detecting the face in the image, including, but not limited to: a skin detection method, a Viola-Jones detection method, a Gabor Filter based method, etc., as described in further detail hereinbelow.
  • the fourth system further includes an image cropper, connected to the face verifier 410 .
  • the image cropper crops the image, and thereby significantly removes background from the image.
  • the image cropper crops the image around the face, leaving a purely facial image (i.e. an image which includes only the face, without background details).
  • the image cropper crops the image, along a rectangle, as illustrated using FIG. 16 , and described in further detail hereinbelow.
  • the fourth system also includes an image resizer, in communication with the face verifier 410 .
  • the image resizer resizes the image into a predefined size, and thereby standardizes the image's size according to a size standard predefined by a user of the fourth system, as described in further detail hereinbelow.
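The resizing step can be sketched with a simple nearest-neighbour mapping onto the standard size. The 128×128 default and the function name are illustrative; a production resizer would likely use a smoother interpolation.

```python
import numpy as np

def resize_to_standard(image, size=(128, 128)):
    """Nearest-neighbour resize of a 2-D image array to a fixed
    standard size, so every face reaches the identifier at the
    same scale."""
    h, w = image.shape[:2]
    th, tw = size
    # Map each target pixel back to its nearest source pixel.
    rows = (np.arange(th) * h) // th
    cols = (np.arange(tw) * w) // tw
    return image[rows][:, cols]
```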
  • the size standard may improve accuracy and efficiency of a face identifier 420 , as described in further detail hereinbelow.
  • the fourth system further includes an image illumination quality improver, in communication with the face verifier 410 .
  • the image illumination quality improver may improve one (or more) qualities of illumination of the image, say using Histogram Equalization, as known in the art and described in further detail hereinbelow.
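Histogram Equalization, named above as one illumination-improvement option, can be sketched as follows for an 8-bit grayscale image; the function name is an illustrative assumption.

```python
import numpy as np

def equalize_histogram(gray):
    """Classical histogram equalization of an 8-bit grayscale image:
    stretch the cumulative intensity distribution across 0..255 to
    reduce the effect of uneven illumination."""
    gray = np.asarray(gray, dtype=np.uint8)
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]            # first non-empty bin
    scale = max(cdf[-1] - cdf_min, 1)
    # Look-up table mapping each input intensity to its equalized value.
    lut = np.clip((cdf - cdf_min) * 255 // scale, 0, 255).astype(np.uint8)
    return lut[gray]
```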
  • FIG. 5 is a block diagram illustrating a fifth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • a fifth networked system for face recognition includes a face verifier 510 .
  • the face verifier 510 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine) or on an end station of a Passenger Authentication System, as described in further detail hereinabove.
  • the face verifier 510 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the fifth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 510 , as known in the art.
  • the face verifier 510 verifies symmetry of a face in one or more image(s), say a sequence of video images of an individual, according to the symmetry criterion.
  • the symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • the face verifier 510 uses an intensity map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a combination thereof, for verifying the symmetry of the face in the image(s), as described in further detail hereinbelow.
  • the face verifier 510 measures symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, the face verifier 510 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • the face verifier 510 may further receive data identifying the face from a user, say using a user interface implemented as a part of the face verifier 510 , or a user interface in association therewith, as known in the art.
  • the user may be an operator of the fifth system, the person whose face is captured in the image(s), etc.
  • the data identifying the face may include, but is not limited to details such as a passport number, a name, or an address.
  • the fifth system further includes a face database updater 530 .
  • the face verifier 510 communicates with the face database updater 530 over a computer network 515 .
  • the network 515 is a wide area network (say the internet) or an intranet network.
  • the intranet network 515 may connect computers and ATMs (Automatic Teller Machines) in one or more branches and offices of a commercial bank.
  • when the face verifier 510 successfully verifies the symmetry of the face in one of the images (say the face of a criminal), the face verifier 510 sends data, over the network 515 , to the face database updater 530 .
  • the sent data may include the whole image, or a part of the image, say one or more features extracted from the image, such as biometric stamps, as described in further detail hereinabove.
  • the sent data further includes data identifying the face, as described in further detail hereinbelow.
  • the face database updater 530 updates a face database 550 with the received data, or a part thereof, associated with the data identifying the face, as described in further detail hereinbelow.
  • the received data includes only one or more features extracted from the face (i.e. a part of the image).
  • the received data includes the whole image (or a significant part thereof).
  • the face database updater 530 extracts one or more features from the image (or from the significant part), and stores the extracted features in the face database 550 .
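The updater's bookkeeping can be sketched with an in-memory mapping standing in for the face database 550. The function name and the dict-based store are illustrative assumptions; the patent's database may be local or remote.

```python
def update_face_database(face_db, identifying_data, features):
    """Store extracted feature vectors under the record identified
    by the supplied personal data (say a passport number), creating
    the record on first enrollment."""
    face_db.setdefault(identifying_data, []).append(features)
    return face_db
```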
  • the face database 550 may be a local database, a remote database accessible through the Internet, etc., as known in the art.
  • FIG. 6 is a block diagram illustrating a sixth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • a sixth networked system for face recognition includes a face verifier 610 .
  • the face verifier 610 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the sixth system, as described in further detail hereinabove.
  • the face verifier 610 receives one or more first image(s) of a face, together with data identifying the face, as described in further detail hereinabove.
  • the data identifying the face may include, but is not limited to details such as a passport number, a name, or an address.
  • the details may be provided by an operator of the sixth system, by an individual whose face is captured in the image, etc.
  • the face verifier 610 verifies symmetry of a face in one or more of the first image(s), according to a symmetry criterion.
  • the first image(s) may include a still video image of a face of an individual who enrolls in a security system, or a sequence of video images of a known criminal the police wishes to store in a database of criminal suspects.
  • the symmetry criterion may be defined by a user of the sixth system, say using a Graphical User Interface (GUI), as described in further detail hereinbelow.
  • the symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • the face verifier 610 uses an intensity map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a combination thereof, for verifying the symmetry of the face in the first image(s), as described in further detail hereinbelow.
  • the face verifier 610 measures symmetry of each one of two or more images input to the face verifier 610 , say images which are a part of a sequence of video images streamed to the face verifier 610 . Then, the face verifier 610 selects one or more first image(s) of the face amongst the input images, such that the measured symmetry of the selected image(s) of the face is highest amongst the input image(s).
  • the sixth system further includes a face database updater 630 .
  • the face verifier 610 communicates with the face database updater 630 over a computer network 615 .
  • the network 615 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • when the face verifier 610 successfully verifies the symmetry of the face in one (or more) of the first images (say the face of a criminal), the face verifier 610 sends data over the network 615 , including data identifying the face.
  • the sent data includes the whole of the first image(s).
  • the sent data includes only a part of each of the first image(s).
  • the data may include one or more features extracted from each of the first image(s), by the face verifier 610 , say a biometric stamp extracted from the first image, as described in further detail hereinabove.
  • the face database updater 630 updates a face database 650 with the received data or with features extracted from the received data, as described in further detail hereinbelow.
  • the face database updater 630 updates the face database 650 with the images selected by the face verifier 610 or with data extracted from the selected images, as described in further detail hereinabove.
  • the sixth system further includes a face identifier 620 .
  • the face verifier 610 communicates with the face identifier 620 over the computer network 615 , say over the intranet, as described in further detail hereinabove.
  • the face verifier 610 further receives a second image of a face, and verifies the symmetry of the face in the second image, according to the predefined symmetry criterion.
  • the face verifier 610 sends data which includes at least a part of the second image, over the network 615 , as described in further detail hereinabove.
  • the face identifier 620 receives the data sent by face verifier 610 .
  • the face identifier 620 identifies the face in the second image, say using the face database 650 , and the received data, as described in further detail hereinabove.
  • an authorized user of a classified information system enrolls in the classified information system.
  • a first image of the authorized user's face is input to the face verifier 610 (say a passport photo), together with data identifying the authorized user, say using a Graphical User Interface (GUI).
  • the data identifying the user may include, but is not limited to details such as a passport number, a name, an address, a role, etc. The details may be provided by an operator of the sixth system, by the authorized user himself, etc.
  • the face verifier 610 verifies the symmetry of the authorized user's face in the first image.
  • the face verifier 610 sends data which includes the first image (or a part of the first image) to the database updater 630 , over the network 615 , together with the data identifying the authorized user.
  • the database updater 630 updates the face database 650 with the received data.
  • when the authorized user attempts to log into the classified information system, a second image of his face is captured live, say by a still camera in communication with the classified information system.
  • the face verifier 610 receives the second image and verifies the symmetry of the authorized user's face in the second image.
  • the face verifier 610 sends data, which includes the second image (or a part of the second image) over the network 615 .
  • the face identifier 620 receives the sent data and identifies the face in the second image, using the face database 650 , as described in further detail hereinbelow.
  • upon positive identification of the authorized user's face, the authorized user is allowed to log into the classified information system.
  • FIG. 7 is a block diagram illustrating a seventh networked system for face recognition, according to an exemplary embodiment of the present invention.
  • a seventh networked system for face recognition includes a face verifier 710 .
  • the face verifier 710 verifies compliance of a face in one or more image(s), with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a comparison made between the image and one or more images previously captured from the same user, symmetry, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the seventh system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 710 , as known in the art.
  • the face verifier 710 verifies symmetry of a face in one or more image(s) received by the face verifier 710 .
  • the images may include, but are not limited to a still video image of a face of an individual, or a sequence of video images of an individual.
  • the face verifier 710 verifies the symmetry according to a symmetry criterion.
  • the symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • the face verifier 710 uses an intensity map, a gradient map, a Fast Fourier Transform (FFT) phase map, an image processing filter output (as known in the art), or a combination thereof, for verifying the symmetry of the face in the image(s), as described in further detail hereinbelow.
  • the face verifier 710 further restricts forwarding of data, which includes at least a part of the image, to a remote receiver over a network 715 (say the internet), according to results of the verification of the symmetry by the face verifier 710 .
  • the face verifier 710 may restrict the forwarding of images to a party, who offers face identification services over the internet.
  • if the face verifier 710 finds the face in the image to be non-symmetric (i.e. the face fails to meet the symmetry criterion), the face verifier 710 blocks the forwarding of data which includes the image (or a part thereof) to the face identifier 420 described hereinabove (using FIG. 4 ), through the network 715 (say the internet), as described in further detail hereinabove.
  • the face verifier 710 also presents an appropriate message.
  • the face verifier 710 may present a message asking an individual whose face image is captured to look straight into an image capturer (say, a still camera), or to align in a position in front of the image capturer, as described in further detail hereinbelow.
  • the image capturer may capture a new (and hopefully, symmetric) image of the face of the individual.
  • if the face verifier 710 finds that the face successfully meets the symmetry criterion (and is thus successfully verified), the face verifier 710 forwards data which includes at least a part of the image to the face identifier 420 , through the network 715 , as described in further detail hereinabove.
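The two branches above (block and prompt on failure, forward on success) can be sketched as a single gate. The function name, the threshold, and the callback interface are illustrative assumptions; `send` stands in for forwarding over the network 715 and `prompt` for presenting the message to the user.

```python
def gate_forwarding(image, symmetry_score, send, prompt, threshold=0.8):
    """Forward image data over the network only when the face meets
    the symmetry criterion; otherwise block the transfer and ask the
    user to realign in front of the camera."""
    if symmetry_score(image) >= threshold:
        send(image)          # forward to the remote face identifier
        return True
    prompt("Please look straight into the camera.")
    return False
```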
  • the face verifier 710 may forward the data to one or more destination(s) set in advance of the verification of the symmetry, say by an operator of the seventh system.
  • the destination(s) may include, but are not limited to: an email address, a database server of a third party, or an application (which may run on a remote computer, etc.).
  • the seventh system may be used as a stand-alone product, or in combination with other systems, say a face recognition system, a security system, etc.
  • FIG. 8 is a flowchart illustrating a first method for face recognition, according to an exemplary embodiment of the present invention.
  • in a first method for face recognition, the compliance of a face with a predefined criterion is verified 810 , as described in further detail hereinabove.
  • the compliance of a face in one or more image(s) may be verified 810 using a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • the criterion may be a symmetry criterion defined by a user of the fourth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 410 , as described in further detail and illustrated using FIG. 4 hereinabove.
  • the symmetry of a face in one or more image(s), is automatically verified 810 according to a symmetry criterion, say using the face verifier 410 , as described in further detail hereinbelow.
  • the symmetry criterion may be based on an intensity map, a phase map, a texture map, an image processing filter, etc., as described in further detail hereinbelow.
  • the first method further includes using an intensity map, for verifying 810 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the first method further includes using a gradient map, for verifying 810 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the first method further includes using a Fast Fourier Transform (FFT) phase map, for verifying 810 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the first method further includes measuring symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, the one or more image(s) of the face are selected amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • data of the image is sent 815 through a computer network (say the internet), for identification.
  • the data of the image may include the whole image, or a part thereof (say one or more features extracted from the image, such as a biometric stamp, as described in further detail hereinabove).
  • the face in the image is identified 820 , using the data, or features extracted from the data, say by the face identifier 420 , as described in further detail hereinabove.
  • the first method further includes a preliminary step of capturing the image of the face, and forwarding the captured image, for the symmetry verification, (say to the face verifier 410 ), as described in further detail hereinabove.
  • the first method further includes detecting the face in the image, say using the face detector, as described in further detail hereinbelow.
  • the detection of the face may be carried out using one or more methods, as known in the art, including, but not limited to: a skin detection method, a Viola-Jones detection method, a Gabor Filter based method, etc., as described in further detail hereinbelow.
  • the first method further includes cropping the image.
  • the cropping may be carried out around the face, thereby leaving a purely facial image (i.e. substantially without background).
  • the cropping may be carried out along a rectangle, significantly removing background from the image, as illustrated using FIG. 16 , and described in further detail hereinbelow.
  • the first method further includes resizing the image into a predefined size, and thereby standardizing the image's size according to a predefined size standard, as described in further detail hereinbelow.
  • the first method further includes improving one or more qualities of illumination of the image, say using Histogram Equalization methods, as described in further detail hereinbelow.
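The cropping and resizing steps above can be sketched as follows, using plain Python lists as grayscale images. The 15% margin and the nearest-neighbour resize are illustrative assumptions, not the patent's exact procedure.

```python
def crop_margins(img, frac=0.15):
    """Crop away frac of the image on each side (top, bottom, left, right),
    leaving a substantially background-free facial region."""
    h, w = len(img), len(img[0])
    dy, dx = int(h * frac), int(w * frac)
    return [row[dx:w - dx] for row in img[dy:h - dy]]

def resize(img, size):
    """Nearest-neighbour resize to a standardized size x size image."""
    h, w = len(img), len(img[0])
    return [[img[i * h // size][j * w // size] for j in range(size)]
            for i in range(size)]
```

Standardizing the size this way makes every image comparable to the equally-sized images stored in the face database.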
  • FIG. 9 is a flowchart illustrating a second method for face recognition, according to an exemplary embodiment of the present invention.
  • In a second method for face recognition, there is verified 910 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • the compliance of a face in one or more image(s) may be verified 910 with a predefined criterion.
  • the predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • symmetry of a face in one or more image(s) is verified 910 according to a symmetry criterion, say using the face verifier 510 , as described in further detail hereinbelow.
  • the symmetry criterion may be defined by a user of the fifth system described in further detail hereinbelow.
  • the second method further includes using an intensity map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • the second method further includes using a gradient map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • the second method further includes using a fast Fourier Transform (FFT) phase map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • the second method further includes measuring symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, one or more image(s) of the face are selected amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • data of the selected image(s) is sent 915 to a face database updater 530 in communication with the face verifier 510 .
  • the data of the selected images is sent 915 over the computer network 515 , say over the internet or an intranet network, as described in further detail hereinabove.
  • the sent data may include the whole image, or a part thereof (say one or more features extracted from the image, as described in further detail hereinabove).
  • a face database 550 is updated 930 with the data of the selected image(s) and associated data identifying the face, say by the face database updater 530 , as described in further detail hereinabove.
  • the face database 550 is updated with one or more features extracted from the received data, as described in further detail hereinabove.
  • the data identifying the face may include, but is not limited to, details such as a passport number, a name, or an address.
  • the details may be provided by an operator of the second system described hereinabove, by an individual whose face is captured in the image, etc.
  • FIG. 10 is a flowchart illustrating a third method for face recognition, according to an exemplary embodiment of the present invention.
  • In a third method, there is verified 1010 the compliance of a face in a first image with a predefined criterion, as described in further detail hereinabove.
  • the compliance of a face in one or more first image(s) may be verified 1010 with a criterion which pertains to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • symmetry of a face in one or more first image(s), is verified 1010 according to a symmetry criterion, say using the face verifier 610 , as described in further detail hereinbelow.
  • the symmetry criterion may be defined by a user of the sixth system described in further detail hereinbelow.
  • the first image(s) may include a passport photo of an individual who enrolls in a security system, a sequence of video images of a known criminal the police wishes to store in a database of criminal suspects, etc.
  • the verification of the symmetry of the first image(s) is carried out using an intensity map, a texture map (i.e. gradient map), a fast Fourier Transform (FFT) phase map, or a combination thereof, as described in further detail hereinbelow.
  • the data of the first images is sent 1030 to a database 650 in communication with the face verifier 610 over a computer network (say the internet).
  • the sent data may include whole images, or a part thereof (say one or more features extracted from the image, as described in further detail hereinabove).
  • the database 650 is updated 1030 with the sent data and associated data identifying the face.
  • the data is sent 1030 (and updated), only if the symmetry of the face in the first image(s) is successfully verified 1010 , say by the face verifier 610 , as described in further detail hereinabove.
  • the data identifying the face may include, but is not limited to, details such as a passport number, a name, or an address.
  • the details may be provided by an operator of the sixth system described hereinabove, by the individual whose face is captured in the image, etc.
  • When one or more second image(s) of the face are presented to the face verifier 610 (say, a video stream of a criminal who attempts to walk into a secure area), the symmetry of the face in the second image(s) is verified 1070 , according to the predefined symmetry criterion, as described in further detail hereinabove.
  • data of the second image(s) is sent 1090 to the face identifier 620 , and the face in the second image(s) is identified 1090 , say by the face identifier 620 , using the face database 650 , as described in further detail hereinabove.
  • a police unit may wish to store a face image together with identifying data of a known criminal in a suspect database.
  • the symmetry of the known criminal's face in the first image is verified 1010 , say using the face verifier 610 , as described in further detail hereinbelow.
  • the suspect database is updated 1030 with the first image of the known criminal, together with the data identifying the known criminal.
  • a surveillance camera may capture a video stream (i.e. second images) of the criminal at a crime scene.
  • the video stream may be used to identify the criminal, say using one of the systems described in further detail hereinabove.
  • the face in the second image may be identified 1090 , using the police unit's suspect database.
  • the police may arrest the known criminal, and use the video stream as evidence against the known criminal.
  • FIG. 11 is a flowchart illustrating a fourth method for face recognition, according to an exemplary embodiment of the present invention.
  • In a fourth method, there is verified 1110 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • the compliance of a face in one or more image(s) may be verified 1110 with a criterion which pertains to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • an image of a face is captured 1100 , say using an image capturer, such as a digital still camera, a video camera, or a surveillance camera (which constantly streams video images of a secure area).
  • a user may approach a face recognition system, which includes the sixth system described in further detail hereinabove, as well as the image capturer (say a video camera).
  • the image capturer may be triggered to capture the image of a user who approaches the face recognition system by a smart card reader connected to the image capturer.
  • Upon insertion of a smart card into the smart card reader by the user, the smart card reader triggers the image capturer to capture the image of the user. Then, the captured image is forwarded for symmetry verification, as described in further detail hereinbelow.
  • the image capturer may be triggered to capture the image of the face of the user who approaches the face recognition system, by a RFID (Radio frequency identification) card reader connected to the image capturer.
  • the RFID card reader triggers the image capturer to capture the image, when the user inserts an RFID card into the RFID reader. Then, the captured image is forwarded for symmetry verification, as described in further detail hereinbelow.
  • the image capturer continuously captures images.
  • the image capturer may be a surveillance camera, which constantly streams video images of a secure area.
  • the image is forwarded to the face verifier 610 , as described in further detail hereinabove.
  • the image is captured in a two dimensional (2D) format, as known in the art.
  • the image is captured in a three dimensional (3D) format, as known in the art.
  • the face in the captured image is verified 1110 , according to a symmetry criterion, say by the face verifier 610 , as described in further detail hereinabove.
  • if the face in the image is found to be non-symmetric (i.e. the symmetry verification fails), the image capturer is instructed (say by the face verifier 610 ) to capture a new image of the face.
  • the image capturer may present an appropriate message, say a message asking an individual whose face image is captured to look straight into the image capturer, or align in a position in front of the image capturer (say, a still camera), as described in further detail hereinbelow. Then, the image capturer captures a new image of the face.
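The capture-verify-recapture loop described above can be sketched as follows; `capture`, `is_symmetric`, and `prompt` are hypothetical callables standing in for the image capturer, the face verifier, and the message display, respectively.

```python
def capture_verified(capture, is_symmetric, prompt, max_tries=3):
    """Recapture until the face image passes the symmetry verification,
    prompting the user to align with the camera between attempts."""
    for _ in range(max_tries):
        img = capture()
        if is_symmetric(img):
            return img  # forwarded for identification
        prompt("Please look straight into the camera")
    return None  # verification failed on every attempt
```

In a deployed system the prompt would appear on a monitor next to the camera, as in the FIG. 17 scenario below.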
  • data which includes the image (or a part of the image) is sent for identification, say over a wide area network, such as the internet, as described in further detail hereinabove.
  • the image is pre-processed 1180 , say by the face identifier 620 , using one of several pre-processing methods currently used for face recognition.
  • the pre-processing methods may be used for sharpening, grey scale modification, removal of red eyes, etc., as known in the art.
  • the face is identified 1190 , say by the face identifier 420 , as described in further detail hereinabove.
  • FIG. 12 is a flowchart illustrating a fifth method for face recognition, according to an exemplary embodiment of the present invention.
  • a video image is captured 1200 , say by a video camera, as described in further detail hereinabove.
  • the face may be detected using one or more methods currently used for detecting a face in the image.
  • the methods currently used may include, but are not limited to: Viola-Jones detection methods, Gabor Jets based methods, skin detection methods, histogram analysis methods, or other methods (say methods based on edge maps, gradients, or standard face shapes), as known in the art.
  • Viola-Jones methods use several image processing filters over the whole image.
  • a neural network algorithm is trained over a training set (say a set of already processed face images).
  • the face is detected using the neural network algorithm and a search is made to find best match values that predict the face center location, as known in the art.
  • Gabor Jets methods use a convolution of Fourier Coefficient of the image with wavelets coefficients of low order, where the values that predict face location are set according to empirically found predictive values, as known in the art.
  • Skin detectors analyze an intensity map presentation of the image, in order to find the pixel intensity values which comply with standard skin values, as known in the art.
  • Histogram analysis methods analyze a histogram of the image, say a Pixel Frequency Histogram, after applying several filters (histogram normalization, histogram stretching, etc.) on the histogram of the image.
  • the filters applied on the image's histogram may enable separation of face from background, as known in the art.
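As one concrete illustration of the skin-detection approach mentioned above, the following applies a commonly cited RGB rule for skin pixels; the thresholds are a textbook heuristic, not values given in the patent.

```python
def is_skin(r, g, b):
    """A widely used RGB heuristic for skin pixels (one of many possible rules)."""
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and (r - g) > 15 and
            (max(r, g, b) - min(r, g, b)) > 15)

def skin_mask(rgb_img):
    """Binary mask marking pixels whose intensity values comply with skin values."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in rgb_img]
```

The face region can then be localized as the largest connected area of the mask, separating face from background.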
  • the image is cropped 1202 , and thus background is significantly removed from the image, as illustrated using FIG. 16 , and described in further detail hereinbelow.
  • the cropped image is resized 1203 into a size, in accordance with a size standard.
  • the size standard may be set by an operator of the first system described in further detail hereinabove.
  • the size standard may improve accuracy and efficiency of identification of the face, since images in a database of face images, which are substantially the same size as the resized image, are more likely to be successfully matched with the resized image, for identifying the face, as described in further detail hereinabove.
  • the illumination qualities of the image may be enhanced using Histogram Equalization, which modifies the dynamic range and contrast of an image by altering the image, as known in the art.
  • the histogram equalization employs a monotonic, non-linear mapping, which re-assigns the intensity values of pixels in the image, such that the improved image contains a uniform distribution of intensities (i.e. a flat histogram).
  • Histogram Equalization is usually introduced using continuous (rather than discrete) process functions, as known in the art.
  • the histogram equalization employs linear mapping, exponential (or logarithmic) mapping, etc., as known in the art.
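The histogram-equalization step above can be sketched as a monotonic look-up table built from the cumulative histogram. This is the standard discrete formulation, offered as an illustration rather than the patent's exact mapping.

```python
def equalize(img, levels=256):
    """Histogram equalization: a monotonic mapping, built from the cumulative
    histogram, that re-assigns pixel intensities toward a flat histogram."""
    pixels = [p for row in img for p in row]
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function of the intensities
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running)
    n = len(pixels)
    lut = [round((c * (levels - 1)) / n) for c in cdf]  # monotonic look-up table
    return [[lut[p] for p in row] for row in img]
```

A low-contrast image is thereby stretched over the full dynamic range, improving the illumination qualities before identification.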
  • data which includes at least a part of the image is sent 1220 over a network (say a wide area network, such as the internet).
  • the face in the image is identified 1220 , say using the face identifier 620 , as described in further detail hereinabove.
  • an image of the face is captured again 1200 , as described in further detail hereinabove.
  • the symmetry criterion may be based on an intensity map of the image, a phase map of the image, a texture map of the image, a statistical model run over images previously received (say by comparison with an average image calculated from previously received images, which is likely to be symmetric), a comparison made between the image and one or more images previously captured from the same user, etc., as described in further detail hereinabove.
  • the symmetry criterion may be predefined before the images are stored in a face database, as described in further detail hereinabove.
  • the face database there are stored images of known faces, which also meet the face criterion, as described in further detail hereinbelow.
  • the symmetry criterion is enforced on all face images the method is used on.
  • the symmetry criterion may improve accuracy and efficiency of identification of the face in the image.
  • the face is aligned into a position where the face appears symmetric (say a position where an individual looks straight into a camera).
  • the uniform face alignment may ease identification of a face in a new image, through comparison with images in the face database.
  • the identification may be eased, since the uniform face alignment may increase similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • FIG. 13 is a flowchart illustrating a sixth method for face recognition, according to an exemplary embodiment of the present invention.
  • In a sixth method for face recognition, there is verified 1310 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • symmetry of a face in one or more image(s), is verified 1310 according to a symmetry criterion, say using the face verifier 710 , as described in further detail hereinbelow.
  • the symmetry criterion may be defined by a user of the seventh system, as described in further detail hereinbelow.
  • the sixth method further includes using an intensity map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the sixth method further includes using a gradient map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the sixth method further includes using a fast Fourier Transform (FFT) phase map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • the image's forwarding is controlled 1370 (say by the face verifier 710 , as described in further detail hereinabove).
  • the sending of data which includes the image (or a part of the image), over the internet (or another wide area network) may be blocked.
  • FIG. 14 is a flowchart illustrating a seventh method for face recognition, according to an exemplary embodiment of the present invention.
  • A seventh method uses an intensity map of an image captured, say by an image capturer (a still camera, a video camera, etc.).
  • the face is found 1401 in the image, as described in further detail hereinabove.
  • the image is cropped 1402 , say 15% on each side (top, bottom, right and left), along a rectangle, as described in further detail, and illustrated using FIG. 16 hereinbelow.
  • the cropped image is resized 1403 , say to 100×100 pixels.
  • the image is modified, using histogram equalization 1404 (say Linear Histogram Equalization), as described in further detail hereinabove.
  • the image is divided 1405 into equal parts: a left side and a right side, along a vertical line passing through a point in the middle of the image.
  • an average pixel intensity is calculated 1406 using all pixels of the right part, denoted hereinbelow as: Right Avg.
  • an average intensity is calculated 1406 using all pixels of the left part, denoted hereinbelow as: Left Avg.
  • the new pixel values P_new(i, j) form a new image, which comprises the new values calculated for the pixels of the left side, and the original values of the pixels of the right side.
  • the new image is denoted hereinbelow as I_new.
  • the new image I_new is flipped 1408 over a central vertical line (i.e. a line which divides the new image into two equal parts, at the image's center), to form a flipped image denoted hereinbelow as I_flipped.
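Steps 1405-1408 can be sketched as below. The excerpt does not reproduce the P_new formula itself, so here the left half is rescaled by Right Avg / Left Avg (a plausible reading that equalizes average intensity across the halves) before comparing I_new with I_flipped; treat those details as assumptions.

```python
def intensity_symmetry(img):
    """Intensity-map symmetry measure: lower values mean a more symmetric face.
    img is a grayscale image of even width, given as a list of rows."""
    h, w = len(img), len(img[0])
    half = w // 2
    left = [row[:half] for row in img]
    right = [row[half:] for row in img]
    left_avg = sum(sum(r) for r in left) / (h * half)    # Left Avg (step 1406)
    right_avg = sum(sum(r) for r in right) / (h * half)  # Right Avg (step 1406)
    scale = right_avg / left_avg if left_avg else 1.0
    # I_new: rescaled left side plus the original right side (step 1407, assumed)
    new = [[p * scale for p in row[:half]] + list(row[half:]) for row in img]
    # I_flipped: I_new mirrored over its central vertical line (step 1408)
    flipped = [row[::-1] for row in new]
    # mean absolute difference between I_new and I_flipped
    return sum(abs(a - b) for rn, rf in zip(new, flipped)
               for a, b in zip(rn, rf)) / (h * w)
```

Rescaling the left half makes the measure tolerant of one-sided illumination, so a frontal face lit from the side is still judged symmetric.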
  • FIG. 15 is a flowchart illustrating an eighth method for face recognition, according to an exemplary embodiment of the present invention.
  • An eighth method uses a phase map of an image captured, say by an image capturer (a still camera, a video camera, etc.).
  • the phase map may be calculated using Fourier Transform (FT), as known in the art.
  • the face is found 1501 in the image, as described in further detail hereinabove.
  • the image is cropped 1502 , say 15% on each side (top, bottom, right and left), along a rectangle, as described in further detail, and illustrated using FIG. 16 hereinbelow.
  • the cropped image is resized 1503 , say to 100×100 pixels.
  • the image is modified, using histogram equalization 1504 (say Linear Histogram Equalization), as described in further detail hereinabove.
  • the image is divided 1505 into equal parts: a left side and a right side, along a vertical line.
  • Symmetry ⁇ Diff Number ⁇ ⁇ of ⁇ ⁇ pixels ⁇ ⁇ of ⁇ ⁇ half ⁇ ⁇ image ⁇ Formula ⁇ ⁇ 5
  • FIG. 16 illustrates cropping of an image of a face, according to an exemplary embodiment of the present invention.
  • an image of a face may be cropped, say 15% on each side, along a rectangle. Consequently, the background is significantly removed from the image.
  • the cropping of the image may result in a more efficient and accurate face recognition, as the identifying is carried out on the face 1611 itself, without unnecessary processing of background details, such as a collar 1612 , which have nothing to do with the face itself.
  • the removal of the background details may also ease identification of a face, by introducing increased similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • the systems may include, but are not limited to: 2D or 3D systems, security system, access control, HLS (Home Land Security), ATM (Automatic Teller Machines), web portals, or any application which requires recognition of the subject.
  • the systems may also include: passport picture capturing, standard image capturing (thus enforcing a standard for image capturing, say for e-Passport or e-ID generation, as known in the art).
  • the systems described in further detail hereinabove may be implemented using a Personal Computer, an embedded system, an FPGA (Field Programmable Gate Array), or any other computing device.
  • FIGS. 17A, 17B, and 17C illustrate a face recognition scenario, according to an exemplary embodiment of the present invention.
  • a user approaches a face recognition system, say a face recognition system based on the seventh system described in further detail hereinabove.
  • the user may be asked to get closer to a camera (say using a message displayed on a video monitor), as illustrated in FIG. 17A .
  • an image of the user's face is captured by the camera.
  • if the face verifier 710 finds the face in the image to be non-symmetric, the user is asked to look straight into the camera (say using a message displayed on a video monitor), as illustrated in FIG. 17B .
  • the camera captures a second image of the user who looks straight into the camera.
  • the face verifier 710 verifies that the user's face in the second image is indeed symmetric, as described in further detail hereinabove.
  • data which includes the second image (or features extracted from the second image) is forwarded to the face identifier 720 , which identifies the user.
  • the data may be forwarded over a wide area network 715 , say the internet, as described in further detail hereinabove.
  • a relevant message is presented to the user, say a welcome message, as illustrated in FIG. 17C .

Abstract

A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to forward the image for feature extraction; a feature extractor, associated with the face verifier, configured to extract a feature from the forwarded image; and a face identifier, communicating with the feature extractor over a network, configured to receive the extracted feature and identify the face in the forwarded image, using the extracted feature.

Description

    FIELD AND BACKGROUND OF THE INVENTION
  • The present invention relates to face authentication and recognition and, more particularly, but not exclusively to a networked system for automatic and remote face authentication and recognition.
  • In recent years, identity theft has become one of the fastest growing crimes in the world.
  • Identity theft is a criminal fraud that involves someone pretending to be someone else in order to steal money or get other benefits. A person whose identity is used can suffer various consequences when he or she is held responsible for the perpetrator's actions.
  • Identity theft includes, but is not limited to business/commercial identity theft (using another's business name to obtain credit), criminal identity theft (posing as another when apprehended for a crime), financial identity theft (using another's identity to obtain goods and services), identity cloning (using another's information to assume his or her identity in daily life), and medical identity theft (using another's information to obtain medical care, drugs, or access to sensitive medical records).
  • In many countries specific laws make it a crime to use another person's identity for personal gain. However, neither laws nor traditional authentication methods (such as passwords or identity cards) have proved useful in preventing identity theft by sophisticated criminals.
  • Many institutions turned to biometric methods for prevention of identity theft crimes.
  • For example, U.S. Pat. No. 5,930,804, to Yu et al., filed on Jun. 9, 1997, describes a method for biometric authentication of individuals involved in transactions employing the Internet.
  • Many governments and international organizations have chosen face recognition as a primary biometric identification method, to base systems for prevention of identity theft crimes on, as well as for other cases where authentication of a person's identity is crucial, say for controlling access to classified information.
  • In recent years, automatic face recognition has been growing rapidly, due to computational and algorithmic improvements, the growing need for authentication or verification in the “global village”, and the need to prevent fraud. Given the growing need to ease password management and to control access to services (such as web bank account access, access to personal medical information, and native access control services), as well as border control and ID services, face recognition has become one of the most promising and preferred technologies. The growing popularity of face recognition also stems from its non-intrusiveness, its ease of use, and its relative freedom from regulative constraints.
  • For example, U.S. Pat. No. 7,050,608, to Dobashi, filed on Mar. 7, 2002, entitled “Face image recognition apparatus”, discloses a face image recognition apparatus. Dobashi's face image recognition apparatus includes a registration information holding section in which a reference feature amount of the face of at least one to-be-recognized person is previously registered.
  • The feature amount of the face is extracted from a face image input via an image input section by use of feature amount extracting section. A recognition section determines the recognition rate between the extracted feature amount and the reference feature amount registered in the registration information holding section. A feature amount adding section additionally registers the feature amount extracted by the feature amount extracting section as a new reference feature amount into the registration information holding section when it is determined that the determined recognition rate is lower than a preset value.
  • U.S. Pat. No. 7,221,809, to Geng, filed on Dec. 17, 2002, entitled “Face recognition system and method”, discloses a method of automatically recognizing a human face. The method described by Geng includes developing a three-dimensional model of a face, and generating a number of two-dimensional images based on the three-dimensional model. The generated two-dimensional images are then enrolled in a database and searched against an input image to identify the face of the input image.
  • Security screening involves capturing images of people in public places and comparing them to images of persons who are known to pose security risks. One prime example of security screening is its use at airport security checkpoints.
  • For example, U.S. Pat. No. 5,164,992, to Turk, filed on Nov. 1, 1990, entitled “Face Recognition System”, describes a recognition system for identifying members of an audience.
  • The system described by Turk includes an imaging system which generates an image of the audience and a selector module for selecting a portion of the generated image. Turk's system further includes a detection means which analyzes the selected image portion to determine whether an image of a person is present, and a recognition module responsive to the detection means for determining whether a detected image of a person identified by the detection means resembles one of a reference set of images of individuals.
  • U.S. patent application Ser. No. 10/719,792, to Monroe, filed on Nov. 21, 2003, entitled “Method for incorporating facial recognition technology in a multimedia surveillance system”, discloses facial recognition technology integrated into a multimedia surveillance system for enhancing the collection, distribution and management of recognition data by utilizing the system's cameras, databases, monitor stations, and notification systems.
  • U.S. patent application Ser. No. 11/450,581, to Chen et al., filed on Jun. 12, 2006, entitled “Three-dimensional face recognition system and method”, describes a three dimensional (3D) face recognition system.
  • Chen's system has a first data storing module for storing three dimensional (3D) face model data and two dimensional (2D) face image data, an input unit for inputting 3D face model data and 2D face image data, a signal conversion module for converting analog data of the 3D face model data and 2D face image data to digital data, and a second data storing module for storing the digital data.
  • Chen's system further includes a micro-processing module for analyzing geometric characteristics of points in the 3D face model data stored in the first and second data storing module to determine feature points of the 3D face model data, and assigning different weight ratios to feature points. Chen's system further includes a comparison module for comparing the feature points stored in the first and second data storing module, during which different geometric characteristics being given different weight ratios, and calculating relativity between the feature points to obtain a comparison result.
  • SUMMARY OF THE INVENTION
  • According to one aspect of the present invention there is provided a networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to forward the image for feature extraction; a feature extractor, associated with the face verifier, configured to extract a feature from the forwarded image; and a face identifier, communicating with the feature extractor over a network, configured to receive the extracted feature and identify the face in the forwarded image, using the extracted feature.
  • According to a second aspect of the present invention there is provided a networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send the image over a network; a feature extractor, communicating with the face verifier over the network, and configured to receive the sent image and extract a feature from the received image; and a face identifier, associated with the feature extractor and configured to identify the face in the received image, using the extracted feature.
  • According to a third aspect of the present invention there is provided a networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send the image over a first network; a feature extractor, communicating with the face verifier over the first network, configured to receive the sent image, extract a feature from the received image, and send the extracted feature over a second network; and a face identifier, communicating with the feature extractor over the second network, configured to receive the extracted feature and identify the face in the received image, using the extracted feature.
  • According to a fourth aspect of the present invention there is provided a networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send data comprising at least a part of the image over a network; and a face identifier, communicating with the face verifier over the network, configured to receive the sent data and identify the face, using at least a part of the received data.
  • According to a fifth aspect of the present invention there is provided a networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and upon successful verification of the compliance, to send data comprising at least a part of the image over a network; and a face database updater, communicating with the face verifier over the network, and configured to receive the sent data and update a face database with at least a part of the received data.
  • According to a sixth aspect of the present invention there is provided a networked system for face recognition, the networked system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and restrict forwarding of data comprising at least a part of the image over a network, according to results of the verification of the compliance.
  • According to a seventh aspect of the present invention there is provided a method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; b) upon the verifying of the compliance being successful, sending data comprising at least a part of the image over a network, for identification; and c) identifying the face in the image, using at least a part of the sent data.
  • According to an eighth aspect of the present invention there is provided a method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; b) upon the verifying of the compliance being successful, sending data comprising at least a part of the image over a network; and c) updating a database of images with at least a part of the sent data.
  • According to a ninth aspect of the present invention there is provided a method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; and b) controlling forwarding of data comprising at least a part of the image through a network, according to a result of the verifying of the compliance.
  • Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
  • Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in order to provide what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention.
  • The description, taken together with the drawings, makes apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
  • In the drawings:
  • FIG. 1 is a block diagram illustrating a first networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a second networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a third networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a fourth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a fifth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating a sixth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 7 is a block diagram illustrating a seventh networked system for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a first method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a second method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a third method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a fourth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 12 is a flowchart illustrating a fifth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 13 is a flowchart illustrating a sixth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating a seventh method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 15 is a flowchart illustrating an eighth method for face recognition, according to an exemplary embodiment of the present invention.
  • FIG. 16 illustrates cropping of an image of a face, according to an exemplary embodiment of the present invention.
  • FIGS. 17a, 17b, and 17c illustrate a face recognition scenario, according to an exemplary embodiment of the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present embodiments comprise a networked system and method for recognizing a face in one or more images (say a still image, a sequence of video images, etc.). The system may be implemented on a wide area network such as the World Wide Web (i.e. the internet), on an intranet network, etc., as described in further detail herein below.
  • According to an exemplary embodiment of the present invention, a database of faces of known individuals (say criminals or authorized users of a classified information system) is used to store images of the faces of the known individuals, or features extracted from the images (say a biometric stamp), as described in further detail hereinbelow.
  • In one example, a face of a user of a computer station remote from the database of faces is captured in an image (say by a still video camera). A client module installed on the remote station selectively sends the image (or data which includes one or more features extracted from the image) over the internet, for storage on the remote face database.
  • The face as captured in the image has to comply with a criterion defined in advance, before the captured image is forwarded for storage on the database, as described in further detail hereinbelow.
  • Optionally, the criterion pertains to a statistical model run over previously received images. For example, the criterion may be based on the degree of deviation of the captured image (and thus the face in the image) from an average image, as known in the art. The average image is calculated from the previously received images, using methods known in the art. In the average image, each pixel's intensity equals an average of the intensities of the pixels in the same position in the previously received images.
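The average-image criterion described above may be sketched as follows. This is only an illustrative sketch: the tiny 2×2 "images", the root-mean-square deviation score, and the threshold value are assumptions for demonstration, not details prescribed by the present embodiments.

```python
import numpy as np

# Previously received images (illustrative 2x2 grayscale intensities).
previous_images = [
    np.array([[100, 102], [98, 101]], dtype=float),
    np.array([[104, 100], [102, 99]], dtype=float),
]

# Each pixel of the average image is the mean of the pixels at the
# same position across all previously received images.
average_image = np.mean(previous_images, axis=0)

def deviation_from_average(image, average):
    """Root-mean-square deviation of an image from the average image."""
    return float(np.sqrt(np.mean((image - average) ** 2)))

new_image = np.array([[101, 101], [100, 100]], dtype=float)
score = deviation_from_average(new_image, average_image)
complies = score < 5.0  # illustrative threshold for the criterion
```

A new image whose deviation exceeds the threshold would fail the criterion and not be forwarded.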
  • Optionally, the criterion is based on a comparison made between the image and one or more images previously captured from the same user. That is to say that the face of the user in the captured image is compared with the face of the same user, as captured in previously received image(s), or with an average image calculated from previously received images of the same user, which thus bears an average of the face, as described in further detail hereinabove.
  • Optionally, the criterion is a symmetry criterion. For example, the face as captured in the image may have to be successfully verified as symmetric before the image (or data) is sent, as described in further detail hereinbelow.
  • Optionally, the symmetry criterion is based on symmetry of a polygon, which connects selected parts of the captured image.
  • Optionally, the selected parts are known elements of a human face (say nose, eyes, or mouth). The known elements may be identified in the captured image using techniques known in the art, such as the Viola-Jones algorithm, Neural Network methods, etc. The centers of the known face elements identified in the captured image are connected to form a polygon, and a verified symmetry of the polygon serves as an indication for the symmetry of the face in the captured image.
  • For example, the centers of the right eye, left eye, and nose, in the captured image, may be connected to form a triangle, which is expected to be isosceles, and thus symmetric. A successful verification of the triangle as isosceles (say by a comparison made between the triangle's arms) indicates that the face captured in the image is indeed symmetric. Similarly, the centers of the eyes and edges of lips in the captured image may be connected to form a trapezoid, which is expected to be symmetric, etc.
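The isosceles-triangle test described above may be sketched as follows. The landmark coordinates and the tolerance are illustrative assumptions; in practice the eye and nose centers would be obtained from a face-element detector such as Viola-Jones.

```python
import math

def distance(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def is_face_symmetric(left_eye, right_eye, nose, tolerance=0.1):
    """Verify that the eyes-nose triangle is (approximately) isosceles,
    by comparing the triangle's two arms (eye-to-nose distances)."""
    left_arm = distance(left_eye, nose)
    right_arm = distance(right_eye, nose)
    # Relative difference between the arms, normalized by the longer arm.
    return abs(left_arm - right_arm) / max(left_arm, right_arm) <= tolerance

# A frontal, well-aligned face: the arms are equal, so the test passes.
frontal = is_face_symmetric((40, 50), (80, 50), (60, 80))
# A strongly rotated face: the nose center shifts sideways and the test fails.
rotated = is_face_symmetric((40, 50), (80, 50), (45, 80))
```

The trapezoid variant would apply the same idea to four points (eye centers and lip edges), checking that the two non-parallel sides are of equal length.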
  • Optionally, the selected parts are segments of the face in the image. The segments are identified in the captured image, using image segmentation methods known in the art, such as Feature Oriented Flood Fill, Texture Analysis, Principal Component Analysis (PCA) based methods, DFT (Discrete Fourier Transform) methods (i.e. harmonic methods), etc.
  • The mass centers of the selected segments (say segments positioned in parts of the image expected to include known parts of the face, say nose, lips, or mouth) in the captured image are connected to form a polygon. A verified symmetry of the polygon serves as an indication for the symmetry of the face, as described in further detail hereinabove.
  • Optionally, the symmetry criterion is applied on a map representation of the image. The map representation may include, but is not limited to: an intensity map, a phase map, a texture map (i.e. gradient map), or any other map generated from the image using standard image processing filters, as known in the art.
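Applying the symmetry criterion to a map representation may be sketched as follows. Here the map is a gradient-magnitude (texture) map, and symmetry is scored by comparing the map with its horizontal mirror; the toy image and the scoring formula are illustrative assumptions.

```python
import numpy as np

def texture_map(image):
    """Gradient-magnitude (texture) map via finite differences."""
    gy, gx = np.gradient(image.astype(float))
    return np.sqrt(gx ** 2 + gy ** 2)

def mirror_symmetry_score(map_2d):
    """Score 1.0 for a perfectly mirror-symmetric map, lower otherwise."""
    mirrored = np.fliplr(map_2d)
    diff = np.abs(map_2d - mirrored).mean()
    scale = np.abs(map_2d).mean() + 1e-12  # avoid division by zero
    return 1.0 - diff / scale

# A perfectly left-right symmetric toy "face" image.
symmetric_image = np.array([
    [10, 20, 10],
    [30, 50, 30],
    [10, 20, 10],
])
score = mirror_symmetry_score(texture_map(symmetric_image))
```

An intensity map or an FFT phase map could be substituted for `texture_map` without changing the scoring step.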
  • The symmetry criterion may be defined before the images are stored in the face database, as described in further detail hereinbelow.
  • Optionally, the symmetry criterion is formulated as a threshold value for symmetry, as known in the art. The threshold value may be a theoretical value based on theoretical calculations, an empirical value derived from experimental data, etc., as known in the art.
  • When a face in a new image (say a face of an individual who uses the computer station and wishes to be granted access to a classified information system) needs to be identified, the face in the new image is tested with respect to the criterion, say the symmetry of the face, as described in further detail hereinabove. That is to say that the face has to comply with the criterion before an attempt is made at identifying the face, say by attempting to match between the captured image and images in the remote database of faces.
  • Thus, according to exemplary embodiments of the present invention, a predefined criterion is enforced on all faces identified in images, using the methods and systems taught hereinbelow.
  • The predefined criterion may improve accuracy and efficiency of identification of the face in the image.
  • For example, in order to meet the symmetry criterion, an individual may be asked to have his face aligned into a position where the face appears symmetric (say a position where the individual looks straight into a camera), as described in further detail hereinbelow.
  • Consequently, a significantly uniform face alignment is produced amongst the images.
  • The uniform face alignment may ease identification of a face in a new image, through comparison with images in the face database. The identification may be eased, since the uniform face alignment may increase similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • Consequently, false face recognition rates, such as FAR (False Acceptance Rate) and FRR (False Rejection Rate), may be significantly reduced.
  • Further, when an individual has to align his face into the position where the individual's face appears symmetric, the individual is less likely to use extreme facial expressions. Extreme facial expressions (such as a widely opened mouth) are known to pose a problem, as far as face recognition (i.e. identification) is concerned.
  • The principles and operation of a system and method according to the present invention may be better understood with reference to the drawings and accompanying description.
  • Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components as set forth in the following description or illustrated in the drawings. The invention is further capable of other embodiments or of being practiced or carried out in other ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description only, and should not be regarded as limiting.
  • Reference is now made to FIG. 1, which is a block diagram illustrating a first networked system for face recognition, according to an exemplary embodiment of the present invention.
  • The first networked system for face recognition includes a face verifier 110.
  • Optionally, the face verifier 110 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine) or on an end station of a Passenger Authentication System, as described in further detail hereinbelow.
  • The face verifier 110 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over previously received images, a comparison made between the image and one or more images previously captured from the same user, symmetry, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the first system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 110, as known in the art.
  • Optionally, the face verifier 110 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 110 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 110 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face is a face of an individual who is a collaborating user.
  • For example, the face may belong to a user who may be asked to move into a better position. The user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera). A new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • Optionally, the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 110, say from a surveillance system (such as a video camera which continuously captures images of a secure area), as described in further detail hereinbelow. The face verifier 110 verifies the symmetry of the face in one or more of the images.
  • The first networked system for face recognition further includes a feature extractor 112, in communication with the face verifier 110.
  • Upon successfully verifying that the face complies with the predefined criterion, say the symmetry criterion, the face verifier 110 forwards the image to the feature extractor 112.
  • Optionally, the face verifier 110 measures compliance of the face with the predefined criterion in each image of a video sequence fed to the face verifier 110. Then, the face verifier 110 selects the one or more image(s) of the face amongst the input images, such that the measured compliance of the selected images of the face is highest amongst the input images. The selected images are forwarded to the feature extractor 112.
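The frame selection step described above may be sketched as follows. The frames are represented here only by pre-computed compliance scores; in an actual system each score would come from the face verifier's measurement (say the symmetry score), and the frame names are hypothetical.

```python
def select_best_frames(frames_with_scores, count=1):
    """Return the `count` frames whose measured compliance is highest."""
    ranked = sorted(frames_with_scores, key=lambda pair: pair[1], reverse=True)
    return [frame for frame, _ in ranked[:count]]

# A fed video sequence: (frame, measured compliance score) pairs.
video_sequence = [("frame_a", 0.62), ("frame_b", 0.91), ("frame_c", 0.78)]

# Only the best-complying frames are forwarded to the feature extractor.
best = select_best_frames(video_sequence, count=2)
```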
  • The feature extractor 112 extracts one or more features from the image.
  • Optionally, the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • Optionally, the extracted features may be biometric stamps, as known in the art. For example, the feature extractor 112 may use PCA (Principal Component Analysis) Projections, in order to generate a vector which is used as a biometric stamp of the image (i.e. a feature of the image), as known in the art.
  • The feature extractor 112 may use one or more feature extraction methods currently known in the art.
  • The feature extraction methods used by the feature extractor 112 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminating Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
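One of the listed methods, PCA projection, may be sketched as follows: flattened training face images define principal components, and a new face is projected onto them to obtain a compact feature vector (a biometric stamp). The tiny random 4-pixel "faces" and the choice of two components are illustrative assumptions.

```python
import numpy as np

# Illustrative training set: 20 face images, each flattened to 4 pixels.
rng = np.random.default_rng(0)
training_faces = rng.normal(size=(20, 4))

# Center the training data around the mean face.
mean_face = training_faces.mean(axis=0)
centered = training_faces - mean_face

# Principal components via SVD of the centered training matrix.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:2]  # keep the 2 leading components

def biometric_stamp(face_vector):
    """Project a face onto the principal components to get its stamp."""
    return components @ (face_vector - mean_face)

stamp = biometric_stamp(training_faces[0])
```

The resulting low-dimensional stamp, rather than the full image, is what the face identifier compares against the database.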
  • The first networked system for face recognition further includes a face identifier 120.
  • The face identifier 120 communicates with the feature extractor 112 over a computer network 115.
  • Optionally, the network 115 is a wide area network (say the internet) or an intranet network. An intranet network is an organization's internal or restricted access network that is similar in functionality to the internet, but is only available to the organization internally.
  • Optionally, the face identifier 120 identifies the face in the image, by matching the feature(s) extracted from the image (say a biometric stamp) with one or more features (say biometric stamps) stored in the database, in advance, as described in further detail hereinbelow.
  • For example, the face identifier 120 may use a database of features previously extracted from face images of known individuals, say known criminals. In the database, each feature (or a group of features) is stored associated with data identifying a specific one of the individuals. Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
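The matching step described above may be sketched as a nearest-neighbor lookup. The names, stored stamps, and distance threshold below are illustrative assumptions; a database of features extracted in advance would take their place.

```python
import numpy as np

# Illustrative database: biometric stamps stored with identifying data.
database = {
    "John Doe": np.array([0.12, 0.80, 0.33]),
    "Jane Roe": np.array([0.90, 0.10, 0.55]),
}

def identify_face(stamp, database, threshold=0.5):
    """Return the identity whose stored stamp is nearest to the given
    stamp, or None when no stored stamp is within the threshold."""
    best_name, best_distance = None, float("inf")
    for name, stored in database.items():
        dist = float(np.linalg.norm(stamp - stored))
        if dist < best_distance:
            best_name, best_distance = name, dist
    return best_name if best_distance <= threshold else None

match = identify_face(np.array([0.10, 0.82, 0.30]), database)
no_match = identify_face(np.array([5.0, 5.0, 5.0]), database)
```

An unidentified face (no stamp within the threshold) could then trigger rejection, or enrollment into the database.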
  • Reference is now made to FIG. 2, which is a block diagram illustrating a second networked system for face recognition, according to an exemplary embodiment of the present invention.
  • The second networked system for face recognition includes a face verifier 210.
  • Optionally, the face verifier 210 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), as described in further detail hereinbelow.
  • The face verifier 210 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the second system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 210, as known in the art.
  • The face verifier 210 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • The symmetry criterion may be defined by a user of the second system, as described in further detail hereinbelow.
  • Optionally, the face verifier 210 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 210 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 210 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face is a face of an individual who is a collaborating user, as described in further detail hereinabove.
  • For example, the face may belong to a user who may be asked to move into a better position. The user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera). A new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • Optionally, the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 210, say from a surveillance system, as described in further detail hereinbelow. The face verifier 210 verifies the symmetry of the face in the images of the video sequence.
  • The second networked system for face recognition further includes a feature extractor 218.
  • The face verifier 210 communicates with the feature extractor 218 over a computer network 215.
  • Optionally, the network 215 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • Upon successfully verifying the symmetry of the face in the image, the face verifier 210 sends the image to the feature extractor 218, over the network 215.
  • Optionally, the face verifier 210 measures symmetry of each image of a video sequence fed to the face verifier 210. Then, the face verifier 210 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. The face verifier 210 sends the selected images to the feature extractor 218, over the network 215.
  • The feature extractor 218 extracts one or more features from the image.
  • Optionally, the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • The feature extractor 218 may use one or more feature extraction methods currently known in the art. The feature extraction methods used by the feature extractor 218 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminating Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
  • The second networked system for face recognition further includes a face identifier 220, in communication with the feature extractor 218.
  • The face identifier 220 identifies the face in the image, by matching the features extracted from the image with one or more features stored in a database, in advance.
  • For example, the face identifier 220 may use a database of features previously extracted from face images of known individuals, say known criminals. In the database, each feature (or a group of features) is stored associated with data identifying a specific one of the individuals. Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
  • The face identifier 220 matches between the feature(s) extracted from the image and feature(s) already stored in the database, and identifies the face as belonging to the individual whose name, address and phone numbers are associated with the feature(s) matched.
  • Reference is now made to FIG. 3, which is a block diagram illustrating a third networked system for face recognition, according to an exemplary embodiment of the present invention.
  • The third networked system for face recognition includes a face verifier 310.
  • Optionally, the face verifier 310 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), as described in further detail hereinbelow.
  • The face verifier 310 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, say by calculating an average image, as described in further detail hereinabove. The predefined criterion may be based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the third system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 310, as known in the art.
  • The face verifier 310 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • The symmetry criterion may be defined by a user of the third system, as described in further detail hereinbelow.
  • Optionally, the face verifier 310 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 310 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 310 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face is a face of an individual who is a collaborating user, as described in further detail hereinabove.
  • For example, the face may belong to a user who may be asked to move into a better position. The user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera). A new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • Optionally, the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 310, say from a surveillance system, as described in further detail hereinbelow. The face verifier 310 verifies the symmetry of the face in one or more of the images.
  • The third networked system for face recognition further includes a feature extractor 317.
  • The face verifier 310 communicates with the feature extractor 317 over a first computer network 315.
  • Optionally, the first network 315 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • Upon successfully verifying the symmetry of the face in the image, the face verifier 310 sends the image to the feature extractor 317, over the first network 315.
  • Optionally, the face verifier 310 measures symmetry of each image of a video sequence fed to the face verifier 310. Then, the face verifier 310 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. The face verifier 310 sends the selected images to the feature extractor 317, over the first network 315.
  • The feature extractor 317 extracts one or more features from the image.
  • Optionally, the extracted features are based on parts of the face which are known to be most invariant under changes of illumination, noise, pose, aging, etc., as known in the art.
  • The feature extractor 317 may use one or more feature extraction methods currently known in the art. The feature extraction methods used by the feature extractor 317 may include, but are not limited to: PCA (Principal Component Analysis), ICA (Independent Component Analysis), LDA (Linear Discriminant Analysis), EP (Evolutionary Pursuit), EBGM (Elastic Bunch Graph Matching), Kernel Methods, Trace Transformations, AAM (Active Appearance Model), Three Dimensional Morphable Modeling, Bayesian Frameworks, SVM (Support Vector Machines), HMM (Hidden Markov Models), etc., as known in the art.
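As one illustration of the listed techniques, PCA-based extraction (the classical "eigenfaces" approach) projects a flattened face image onto the leading principal directions of a training set; the compact projection coefficients then serve as the extracted features. The plain-NumPy sketch below is a simplification under assumed data shapes, not the extractor's actual implementation.

```python
import numpy as np

def fit_pca(training_faces, n_components):
    """Learn a PCA basis from flattened training face vectors (rows)."""
    X = np.asarray(training_faces, dtype=float)
    mean = X.mean(axis=0)
    # SVD of the centred data yields the principal directions in vt.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def extract_features(face, mean, basis):
    """Project a flattened face onto the PCA basis to get its features."""
    return (np.asarray(face, dtype=float) - mean) @ basis.T
```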
  • The third networked system for face recognition further includes a face identifier 320.
  • Optionally, the face identifier 320 communicates with the feature extractor 317 over a second computer network 319.
  • Optionally, the second network 319 may be the same network as the first network 315 (that is to say that the face verifier 310, the feature extractor 317, and the face identifier 320, are all connected by the same network, say the internet).
  • Optionally, the second network 319 is another network, be it an intranet network, the internet, or another wide area network, as described in further detail hereinabove.
  • The face identifier 320 identifies the face in the image, by matching the features extracted from the image with one or more features stored in a database in advance.
  • For example, the face identifier 320 may use a database of features previously extracted from face images of known individuals, say known criminals. In the database, each feature (or a group of features) is stored associated with data identifying a specific one of the individuals. Exemplary data identifying the individual may include, but is not limited to: name, address, phone numbers, etc.
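A minimal sketch of the matching step: stored feature vectors are compared with the extracted ones by a distance measure, and the identity record attached to the closest match (within a tolerance) is returned. The nearest-neighbour rule and the distance threshold are assumptions for illustration, not the patent's method.

```python
import math

def match_features(query, database, max_distance=1.0):
    """Return the identity record of the closest stored feature vector.

    `database` maps identity records (say a name or passport number) to
    feature vectors; returns None when nothing is close enough.
    """
    best_identity, best_distance = None, max_distance
    for identity, features in database.items():
        distance = math.dist(query, features)  # Euclidean distance
        if distance < best_distance:
            best_identity, best_distance = identity, distance
    return best_identity
```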
  • Reference is now made to FIG. 4, which is a block diagram illustrating a fourth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • The fourth networked system for face recognition includes a face verifier 410.
  • Optionally, the face verifier 410 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine), an end station at an entrance of a secure area, etc.
  • Optionally, the face verifier 410 is implemented on an end station of a Passenger Authentication System. The station is deployed by the entrance of a plane and used for ensuring that only a person granted a boarding pass boards the plane (and not an impostor), etc.
  • The face verifier 410 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the fourth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 410, as known in the art.
  • The face verifier 410 verifies symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), according to a symmetry criterion.
  • The symmetry criterion may be defined by a user of the fourth system, as described in further detail hereinbelow.
  • Optionally, the face verifier 410 uses an intensity map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 410 uses a texture map (i.e. gradient map), for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the face verifier 410 uses a Fast Fourier Transform (FFT) phase map, for verifying the symmetry of the face in the image, as described in further detail hereinbelow.
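A texture (gradient) map can likewise be checked for symmetry: image gradients are computed, the gradient-magnitude map is mirrored about the vertical mid-line, and the two halves are compared. The scoring below is an illustrative sketch, not the verifier's defined criterion.

```python
import numpy as np

def gradient_symmetry_score(face):
    """Score left-right symmetry of a face's gradient-magnitude map."""
    gy, gx = np.gradient(face.astype(float))
    magnitude = np.hypot(gx, gy)          # texture (gradient) map
    h, w = magnitude.shape
    half = w // 2
    left = magnitude[:, :half]
    right = np.fliplr(magnitude[:, w - half:])
    # Normalise by the strongest gradient so the score lies in [0, 1].
    scale = magnitude.max() or 1.0
    return 1.0 - np.abs(left - right).mean() / scale
```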
  • Optionally, the face is a face of an individual who is a collaborating user.
  • For example, the face may belong to a user who may be asked to move into a better position. The user collaborates by moving into a better aligned position (say a position where the user looks directly into a still camera). A new image of the user's face, as captured from the better aligned position, may be more symmetric, as described in further detail hereinbelow.
  • Optionally, the images are a part of a video sequence, and the video sequence is continuously fed to the face verifier 410, say from a surveillance system (such as a video camera which continuously captures images of a secure area), as described in further detail hereinbelow.
  • The face verifier 410 verifies the symmetry of the face in each of the images.
  • Optionally, when the face verifier 410 successfully verifies the symmetry of the face in one of the images, the face verifier 410 sends data to a remote face identifier 420, as described in further detail hereinbelow.
  • Optionally, the sent data includes the whole image.
  • Alternatively, the sent data includes only a part of the image. For example, the face verifier 410 may extract one or more features from the image, using feature extraction methods known in the art, as described in further detail hereinabove. Following the extraction, the data which includes the features extracted from the image is sent to a remote face identifier 420, as described in further detail hereinabove.
  • Optionally, the face verifier 410 measures symmetry of each one of two or more images of the video sequence fed to the face verifier 410.
  • Then, the face verifier 410 selects one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images. Consequently, data which includes at least a part of each of the selected images is sent to a remote face identifier 420, over a network, as described in further detail hereinbelow.
  • The fourth system further includes a remote face identifier 420.
  • The face verifier 410 communicates with the face identifier 420 over a computer network 415.
  • Optionally, the network 415 is a wide area network (say the internet) or an intranet network.
  • The face identifier 420 identifies the face. The face identifier 420 may use any of currently used face identification methods, for identifying the face, as described in further detail hereinabove.
  • For example, the face identifier 420 may receive data, which includes the whole image (or a part of the image), from the face verifier 410.
  • The face identifier 420 may extract one or more features from the image (or from the part of the image).
  • Optionally, the face identifier 420 identifies the face in the image sent by the face verifier 410, by matching the extracted features with feature data stored in a face database 450, in advance of the matching.
  • The feature data is stored in the face database 450, together with personal data identifying individuals. Upon successful matching of the features extracted from the received data and feature data stored in the face database 450, the face identifier 420 identifies the face in the image sent by the face verifier 410, as belonging to an individual having the personal data associated with the matched feature data.
  • Optionally, the fourth system further includes an image capturer, connected to the face verifier 410. The image capturer may include, but is not limited to a digital still camera, a video camera, a web camera, etc.
  • The image capturer captures the image(s) of the face, and forwards the captured image(s) to the face verifier 410.
  • Optionally, when the face verifier 410 finds the face in the image non-symmetric (say when the face fails to meet the symmetry criterion), the face verifier 410 instructs the image capturer (say the digital still camera) to capture a new image of the face.
  • Optionally, upon finding the face non-symmetric, the face verifier 410 presents an appropriate message (say a message asking an individual whose face image is captured to look straight into the image capturer, etc.), and the image capturer captures a new image of the face, as described in further detail hereinbelow.
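The capture-verify-retry loop just described can be sketched as follows. Here `capture`, `verify`, and `prompt` are stand-ins for the image capturer, the symmetry check, and the message display; the retry limit and message text are assumptions.

```python
def capture_verified_image(capture, verify, prompt, max_attempts=3):
    """Capture images until one passes verification.

    Between failed attempts the user is prompted to realign; returns
    None if every attempt fails.
    """
    for attempt in range(max_attempts):
        image = capture()
        if verify(image):
            return image
        prompt("Please look straight into the camera.")
    return None
```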
  • Optionally, the fourth system further includes a face detector, in communication with the face verifier 410.
  • The face detector detects the face in the image. The face detector may use one or more methods known in the art for detecting the face in the image, including, but not limited to: a skin detection method, a Viola-Jones detection method, a Gabor Filter based method, etc., as described in further detail hereinbelow.
  • Optionally, the fourth system further includes an image cropper, connected to the face verifier 410.
  • The image cropper crops the image, and thereby significantly removes background from the image.
  • Optionally, the image cropper crops the image around the face, leaving a purely facial image (i.e. an image which includes only the face, without background details).
  • Optionally, the image cropper crops the image, along a rectangle, as illustrated using FIG. 16, and described in further detail hereinbelow.
  • Optionally, the fourth system also includes an image resizer, in communication with the face verifier 410.
  • The image resizer resizes the image into a predefined size, and thereby standardizes the image's size according to a size standard predefined by a user of the fourth system, as described in further detail hereinbelow. The size standard may improve accuracy and efficiency of a face identifier 420, as described in further detail hereinbelow.
  • Optionally, the fourth system further includes an image illumination quality improver, in communication with the face verifier 410.
  • The image illumination quality improver may improve one (or more) qualities of illumination of the image, say using Histogram Equalization, as known in the art and described in further detail hereinbelow.
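The cropping, resizing, and illumination steps above form a small preprocessing chain. The NumPy sketch below illustrates one plausible version: crop along a face rectangle, resize by nearest-neighbour sampling, then apply histogram equalization; the specific choices (e.g. the interpolation used) are assumptions for illustration.

```python
import numpy as np

def crop(image, top, left, height, width):
    """Crop the image along a rectangle around the detected face."""
    return image[top:top + height, left:left + width]

def resize(image, out_h, out_w):
    """Nearest-neighbour resize to a predefined standard size."""
    h, w = image.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return image[rows][:, cols]

def equalize_histogram(image):
    """Spread the grayscale histogram over the full 0-255 range."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[hist.nonzero()[0][0]]   # CDF at the first occupied bin
    span = max(cdf[-1] - cdf_min, 1)
    table = (cdf - cdf_min) * 255 // span
    return table[image].astype(np.uint8)
```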
  • Reference is now made to FIG. 5, which is a block diagram illustrating a fifth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • A fifth networked system for face recognition includes a face verifier 510.
  • Optionally, the face verifier 510 is implemented on a client computer, say on a computer associated with an ATM (Automatic Teller Machine) or on an end station of a Passenger Authentication System, as described in further detail hereinabove.
  • The face verifier 510 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the fifth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 510, as known in the art.
  • The face verifier 510 verifies symmetry of a face in one or more image(s), say a sequence of video images of an individual, according to the symmetry criterion.
  • The symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • Optionally, the face verifier 510 uses an intensity map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a combination thereof, for verifying the symmetry of the face in the image(s), as described in further detail hereinbelow.
  • Optionally, the face verifier 510 measures symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, the face verifier 510 selects the one or more image(s) of the face amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • The face verifier 510 may further receive data identifying the face from a user, say using a user interface implemented as a part of the face verifier 510, or a user interface in association therewith, as known in the art. The user may be an operator of the fifth system, the person whose face is captured in the image(s), etc. The data identifying the face may include, but is not limited to details such as a passport number, a name, or an address.
  • The fifth system further includes a face database updater 530.
  • The face verifier 510 communicates with the face database updater 530 over a computer network 515.
  • Optionally, the network 515 is a wide area network (say the internet) or an intranet network. For example, the intranet network 515 may connect computers and ATMs (Automatic Teller Machines) in one or more branches and offices of a commercial bank.
  • When the face verifier 510 successfully verifies the symmetry of the face in one of the images (say the face of a criminal), the face verifier 510 sends data, over the network 515, to the face database updater 530. The sent data may include the whole image, or a part of the image, say one or more features extracted from the image, such as biometric stamps, as described in further detail hereinabove.
  • Optionally, the sent data further includes data identifying the face, as described in further detail hereinbelow.
  • The face database updater 530 updates a face database 550 with the received data, or a part thereof, associated with the data identifying the face, as described in further detail hereinbelow.
  • Optionally, the received data includes only one or more features extracted from the face (i.e. a part of the image).
  • Alternatively, the received data includes the whole image (or a significant part thereof). The face database updater 530 extracts one or more features from the image (or from the significant part), and stores the extracted features in the face database 550.
  • The face database 550 may be a local database, a remote database accessible through the Internet, etc., as known in the art.
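The updater's role can be sketched as follows: receive either a whole image or already-extracted features together with data identifying the face, extract features when needed, and store them keyed by identity. The in-memory dictionary and the `extract` callable are illustrative stand-ins for the face database 550 and the feature extraction step.

```python
def update_face_database(database, identity, data, extract, is_image):
    """Store feature data for `identity` in the face database.

    When `data` is a whole image (`is_image` is True), features are
    extracted first; otherwise `data` already holds the features.
    """
    features = extract(data) if is_image else data
    database[identity] = features
    return database
```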
  • Reference is now made to FIG. 6, which is a block diagram illustrating a sixth networked system for face recognition, according to an exemplary embodiment of the present invention.
  • A sixth networked system for face recognition includes a face verifier 610.
  • The face verifier 610 verifies compliance of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the sixth system, as described in further detail hereinabove.
  • The face verifier 610 receives one or more first image(s) of a face, together with data identifying the face, as described in further detail hereinabove.
  • The data identifying the face may include, but is not limited to details such as a passport number, a name, or an address. The details may be provided by an operator of the sixth system, by an individual whose face is captured in the image, etc.
  • The face verifier 610 verifies symmetry of a face in one or more of the first image(s), according to a symmetry criterion.
  • For example, the first image(s) may include a still video image of a face of an individual who enrolls in a security system, or a sequence of video images of a known criminal the police wishes to store in a database of criminal suspects.
  • The symmetry criterion may be defined by a user of the sixth system, say using a Graphical User Interface (GUI), as described in further detail hereinbelow. The symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • Optionally, the face verifier 610 uses an intensity map, a gradient map, a Fast Fourier Transform (FFT) phase map, or a combination thereof, for verifying the symmetry of the face in the first image(s), as described in further detail hereinbelow.
  • Optionally, the face verifier 610 measures symmetry of each one of two or more images input to the face verifier 610, say images which are a part of a sequence of video images streamed to the face verifier 610. Then, the face verifier 610 selects one or more first image(s) of the face amongst the input images, such that the measured symmetry of the selected image(s) of the face is highest amongst the input image(s).
  • The sixth system further includes a face database updater 630.
  • The face verifier 610 communicates with the face database updater 630 over a computer network 615.
  • Optionally, the network 615 is a wide area network (say the internet) or an intranet network, as described in further detail hereinabove.
  • When the face verifier 610 successfully verifies the symmetry of the face in one (or more) of the first images (say the face of a criminal), the face verifier 610 sends data over the network 615, including data identifying the face.
  • Optionally, the sent data includes the whole of the first image(s).
  • Alternatively, the sent data includes only a part of each of the first image(s).
  • For example, the data may include one or more features extracted from each of the first image(s), by the face verifier 610, say a biometric stamp extracted from the first image, as described in further detail hereinabove.
  • The face database updater 630 updates a face database 650 with the received data or with features extracted from the received data, as described in further detail hereinbelow.
  • Optionally, the face database updater 630 updates the face database 650 with the images selected by the face verifier 610 or with data extracted from the selected images, as described in further detail hereinabove.
  • The sixth system further includes a face identifier 620.
  • The face verifier 610 communicates with the face identifier 620 over the computer network 615, say over the intranet, as described in further detail hereinabove.
  • When one or more second image(s) of a face are presented to the face verifier 610 (say, a video stream of an individual who attempts to walk into a secure area), the face verifier 610 verifies the symmetry of the face in the second image, according to the predefined symmetry criterion.
  • If the symmetry of the face in one or more of the second images is successfully verified, the face verifier 610 sends data which includes at least a part of the second image, over the network 615, as described in further detail hereinabove.
  • The face identifier 620 receives the data sent by face verifier 610. The face identifier 620 identifies the face in the second images, say using the face database 650, and the received data, as described in further detail hereinabove.
  • In one example, an authorized user of a classified information system enrolls in the classified information system.
  • A first image of the authorized user's face is input to the face verifier 610 (say a passport photo), together with data identifying the authorized user, say using a Graphical User Interface (GUI). The data identifying the user may include, but is not limited to details such as a passport number, a name, an address, a role, etc. The details may be provided by an operator of the sixth system, by the authorized user himself, etc.
  • If the face verifier 610 verifies the symmetry of the authorized user's face in the first image, the face verifier 610 sends data which includes the first image (or a part of the first image) to the database updater 630, over the network 615, together with the data identifying the authorized user. The database updater 630 updates the face database 650 with the received data.
  • The next time the authorized user wishes to log into the classified information system, a second image of his face is captured live, say by a still camera in communication with the classified information system.
  • The face verifier 610 receives the second image and verifies the symmetry of the authorized user's face in the second image.
  • As the symmetry of the authorized user's face in the second image is successfully verified, the face verifier 610 sends data, which includes the second image (or a part of the second image) over the network 615. The face identifier 620 receives the sent data and identifies the face in the second image, using the face database 650, as described in further detail hereinbelow.
  • Consequently, upon positive identification of the authorized user's face, the authorized user is allowed to log into the classified information system.
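The enrollment and log-in sequence just described can be sketched end to end. Every component here (the verifier, the database, the matching step) is a simplified stand-in; the data shapes and the matching rule are assumptions, not the sixth system's actual implementation.

```python
def enroll(database, verify, identity, first_image):
    """Enroll a user: store the image data only if the face verifies."""
    if not verify(first_image):
        return False
    database[identity] = first_image
    return True

def log_in(database, verify, match, second_image):
    """Identify a live image against enrolled data; None denies access."""
    if not verify(second_image):
        return None
    return match(second_image, database)
```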
  • Reference is now made to FIG. 7, which is a block diagram illustrating a seventh networked system for face recognition, according to an exemplary embodiment of the present invention.
  • A seventh networked system for face recognition includes a face verifier 710.
  • The face verifier 710 verifies compliance of a face in one or more image(s), with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a comparison made between the image and one or more images previously captured from the same user, symmetry, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the seventh system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 710, as known in the art.
  • The face verifier 710 verifies symmetry of a face in one or more image(s) received by the face verifier 710.
  • The images may include, but are not limited to a still video image of a face of an individual, or a sequence of video images of an individual.
  • The face verifier 710 verifies the symmetry according to a symmetry criterion.
  • The symmetry criterion may be based on an intensity map, a phase map, a texture map, etc., as described in further detail hereinbelow.
  • Optionally, the face verifier 710 uses an intensity map, a gradient map, a fast Fourier Transform (FFT) phase map, an image processing filter output (as known in the art), or a combination thereof, for verifying the symmetry of the face in the image(s), as described in further detail hereinbelow.
  • The face verifier 710 further restricts forwarding of data, which includes at least a part of the image to a remote receiver over a network 715 (say the internet), according to results of the verification of the symmetry by the face verifier 710.
  • For example, the face verifier 710 may restrict the forwarding of images to a party, who offers face identification services over the internet.
  • In a first example, the face verifier 710 finds the face in the image to be non-symmetric (i.e. the face fails to meet the symmetry criterion). Consequently, the face verifier 710 blocks the forwarding of data which includes the image (or a part thereof) to the face identifier 420 described hereinabove (using FIG. 4), through a network 715 (say the internet), as described in further detail hereinabove.
  • Optionally, the face verifier 710 also presents an appropriate message.
  • For example, the face verifier 710 may present a message asking an individual whose face image is captured to look straight into an image capturer (say, a still camera), or to align in a position in front of the image capturer, as described in further detail hereinbelow.
  • Then, the image capturer may capture a new (and hopefully, symmetric) image of the face of the individual.
  • When the face verifier 710 finds that the face successfully meets the symmetry criterion (and is thus successfully verified), the face verifier 710 forwards data which includes at least a part of the image to the face identifier 420, through the network 715, as described in further detail hereinabove.
  • In a second example, the face verifier 710 may forward the data to one or more destination(s) set in advance of the verification of the symmetry, say by an operator of the seventh system. The destination(s) may include, but are not limited to: an email address, a database server of a third party, or an application (which may run on a remote computer, etc.).
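The restriction behaviour of the seventh system can be sketched as a gate in front of the network: data derived from an image is forwarded to the preset destinations only when verification succeeds, and a corrective message is raised otherwise. The destination interface (`send` callables) and the message text are assumptions for illustration.

```python
def gated_forward(image, verify, extract, destinations, on_reject):
    """Forward extracted data to each destination only if the face in
    `image` meets the verification criterion; otherwise block the
    forwarding and invoke the rejection handler (say to show a message).
    """
    if not verify(image):
        on_reject("Please align your face with the camera.")
        return False
    data = extract(image)
    for send in destinations:
        send(data)
    return True
```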
  • The seventh system may be used as a stand-alone product, or in combination with other systems, say a face recognition system, a security system, etc.
  • Reference is now made to FIG. 8, which is a flowchart illustrating a first method for face recognition, according to an exemplary embodiment of the present invention.
  • In a first method for face recognition, according to an exemplary embodiment of the present invention, there is verified 810 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • For example, the compliance of a face in one or more image(s), may be verified 810 using a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, the criterion may be a symmetry criterion defined by a user of the fourth system, say using a Graphical User Interface (GUI) implemented as a part of the face verifier 410, as described in further detail and illustrated using FIG. 4 hereinabove.
  • The symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.) is automatically verified 810 according to a symmetry criterion, say using the face verifier 410, as described in further detail hereinbelow.
  • The symmetry criterion may be based on an intensity map, a phase map, a texture map, an image processing filter, etc., as described in further detail hereinbelow.
  • Optionally, the first method further includes using an intensity map, for verifying 810 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the first method further includes using a gradient map, for verifying 810 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the first method further includes using a fast Fourier Transform (FFT) phase map, for verifying 810 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the first method further includes measuring symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, the one or more image(s) of the face are selected amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • Upon successful verification 810 of the symmetry of the face in the image, data of the image is sent 815 through a computer network (say the internet), for identification.
  • The data of the image may include the whole image, or a part thereof (say one or more features extracted from the image, such as a biometric stamp, as described in further detail hereinabove).
  • Finally, the face in the image is identified 820, using the data, or features extracted from the data, say by the face identifier 420, as described in further detail hereinabove.
  • Optionally, the first method further includes a preliminary step of capturing the image of the face, and forwarding the captured image, for the symmetry verification, (say to the face verifier 410), as described in further detail hereinabove.
  • Optionally, the first method further includes detecting the face in the image, say using the face detector, as described in further detail hereinbelow.
  • The detection of the face may be carried out using one or more methods, as known in the art, including, but not limited to: a skin detection method, a Viola-Jones detection method, a Gabor Filter based method, etc., as described in further detail hereinbelow.
  • Optionally, the first method further includes cropping the image.
  • Optionally, the cropping may be carried out around the face, thereby leaving a purely facial image (i.e. substantially without background).
  • Optionally, the cropping may be carried out along a rectangle, significantly removing background from the image, as illustrated using FIG. 16, and described in further detail hereinbelow.
  • Optionally, the first method further includes resizing the image into a predefined size, and thereby standardizing the image's size according to a predefined size standard, as described in further detail hereinbelow.
  • Optionally, the first method further includes improving one or more qualities of illumination of the image, say using Histogram Equalization methods, as described in further detail hereinbelow.
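Steps 810-820 of the first method can be strung together as a single pipeline: verify the face, send the image data over the network, then identify it remotely. The sketch below mocks the network transport and the identification back-end as plain callables; both are assumptions, outside this illustration.

```python
def recognize(image, verify, send, identify):
    """First-method pipeline: verify (810), send (815), identify (820)."""
    if not verify(image):
        return None                 # verification failed: nothing is sent
    data = send(image)              # say, serialize and transmit the image
    return identify(data)           # remote identification result
```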
  • Reference is now made to FIG. 9, which is a flowchart illustrating a second method for face recognition, according to an exemplary embodiment of the present invention.
  • In a second method for face recognition, according to an exemplary embodiment of the present invention, there is verified 910 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • For example, the compliance of a face in one or more image(s) may be verified 910 with a predefined criterion. The predefined criterion may pertain to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.) is verified 910 according to a symmetry criterion, say using the face verifier 510, as described in further detail and illustrated using FIG. 5 hereinabove.
  • The symmetry criterion may be defined by a user of the fifth system described in further detail hereinbelow.
  • Optionally, the second method further includes using an intensity map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the second method further includes using a gradient map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the second method further includes using a fast Fourier Transform (FFT) phase map, for verifying 910 the symmetry of the face in the image, as described in further detail, hereinbelow.
  • Optionally, the second method further includes measuring symmetry of each one of two or more input images (say images which are a part of a sequence of video images, or a video stream). Then, one or more image(s) of the face are selected amongst the input images, such that the measured symmetry of the selected images of the face is highest amongst the input images.
  • If the symmetry of the face in the image(s) is successfully verified 910 (say by the face verifier 510), data of the selected image(s) is sent 915 to a face database updater 530 in communication with the face verifier 510. The data of the selected images is sent 915 over the computer network 515, say over the internet or an intranet network, as described in further detail hereinabove.
  • The sent data may include the whole image, or a part thereof (say one or more features extracted from the image, as described in further detail hereinabove).
  • Optionally, a face database 550 is updated 930 with the data of the selected image(s) and associated data identifying the face, say by the face database updater 530, as described in further detail hereinabove. Alternatively, the face database 550 is updated with one or more features extracted from the received data, as described in further detail hereinabove.
  • The data identifying the face may include, but is not limited to, details such as a passport number, a name, or an address. The details may be provided by an operator of the second system described hereinabove, by an individual whose face is captured in the image, etc.
  • Reference is now made to FIG. 10, which is a flowchart illustrating a third method for face recognition, according to an exemplary embodiment of the present invention.
  • In a third method, according to an exemplary embodiment of the present invention, there is verified 1010 the compliance of a face in a first image with a predefined criterion, as described in further detail hereinabove.
  • For example, the compliance of a face in one or more first image(s), may be verified 1010 with a criterion which pertains to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, symmetry of a face in one or more first image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.) is verified 1010 according to a symmetry criterion, say using the face verifier 610, as described in further detail hereinbelow.
  • The symmetry criterion may be defined by a user of the sixth system described in further detail hereinbelow.
  • For example, symmetry of a face in one or more first image(s) is verified 1010, according to a symmetry criterion, say using the face verifier 610, as described in further detail hereinbelow.
  • For example, the first image(s) may include a passport photo of an individual who enrolls in a security system, a sequence of video images of a known criminal the police wishes to store in a database of criminal suspects, etc.
  • The symmetry criterion may be defined by a user of the sixth system, as described in further detail hereinbelow.
  • Optionally, the verification of the symmetry of the first image(s) is carried out using an intensity map, a texture map (i.e. gradient map), a fast Fourier Transform (FFT) phase map, or a combination thereof, as described in further detail hereinbelow.
  • Next, the data of the first images is sent 1030 to a database 650 in communication with the face verifier 610 over a computer network (say the internet). The sent data may include whole images, or a part thereof (say one or more features extracted from the image, as described in further detail hereinabove).
  • Next, the database 650 is updated 1030 with the sent data and associated data identifying the face. The data is sent 1030 (and updated), only if the symmetry of the face in the first image(s) is successfully verified 1010, say by the face verifier 610, as described in further detail hereinabove.
  • The data identifying the face may include, but is not limited to, details such as a passport number, a name, or an address. The details may be provided by an operator of the sixth system described hereinabove, by the individual whose face is captured in the image, etc.
  • When one or more second image(s) of the face are presented to the face verifier 610 (say, a video stream of a criminal who attempts to walk into a secure area), the symmetry of the face in the second image(s) is verified 1070, according to the predefined symmetry criterion, as described in further detail hereinabove.
  • If the symmetry of the face in the second image(s) is successfully verified 1070, data of the second image(s) is sent 1090 to the face identifier 620, and the face in the second image(s) is identified 1090, say by the face identifier 620, using the face database 650, as described in further detail hereinabove.
  • For example, a police unit may wish to store a face image together with identifying data of a known criminal in a suspect database.
  • The symmetry of the known criminal's face in the first image is verified 1010, say using the face verifier 610, as described in further detail hereinbelow.
  • If the symmetry of the known criminal's face in the first image is successfully verified 1010, the suspect database is updated 1030 with the first image of the known criminal, together with the data identifying the known criminal.
  • A surveillance camera may capture a video stream (i.e. second images) of the criminal at a crime scene. The video stream may be used to identify the criminal, say using one of the systems described in further detail hereinabove.
  • When the symmetry of the criminal's face in the second image is successfully verified 1070, the face in the second image may be identified 1090, using the police unit's suspect database.
  • Consequently, upon positive identification of the criminal's face, the police may arrest the known criminal, and use the video stream as evidence against the known criminal.
  • Reference is now made to FIG. 11, which is a flowchart illustrating a fourth method for face recognition, according to an exemplary embodiment of the present invention.
  • In a fourth method, according to an exemplary embodiment of the present invention, there is verified 1110 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • For example, the compliance of a face in one or more image(s), may be verified 1110 with a criterion which pertains to a statistical model run over images previously received, a criterion based on a comparison made between the image and one or more images previously captured from the same user, a symmetry criterion, etc., as described in further detail hereinabove.
  • In one example, an image of a face is captured 1100, say using an image capturer, such as a digital still camera, a video camera, or a surveillance camera (which constantly streams video images of a secure area).
  • For example, a user may approach a face recognition system, which includes the sixth system described in further detail hereinabove, as well as the image capturer (say a video camera).
  • Optionally, the image capturer may be triggered to capture the image of a user who approaches the face recognition system by a smart card reader connected to the image capturer.
  • Upon insertion of a smart card into the smart card reader, by the user, the smart card reader triggers the image capturer, to capture the image of the user. Then, the captured image is forwarded for symmetry verification, as described in further detail hereinbelow.
  • Similarly, the image capturer may be triggered to capture the image of the face of the user who approaches the face recognition system, by a RFID (Radio Frequency Identification) card reader connected to the image capturer. The RFID card reader triggers the image capturer to capture the image, when the user inserts an RFID card into the RFID reader. Then, the captured image is forwarded for symmetry verification, as described in further detail hereinbelow.
  • Optionally, the image capturer continuously captures images. For example, the image capturer may be a surveillance camera, which constantly streams video images of a secure area. Upon detection of the user's face in the image (say by the face detector), the image is forwarded to the face verifier 610, as described in further detail hereinabove.
  • Optionally, the image is captured in a two dimensional (2D) format, as known in the art.
  • Optionally, the image is captured in a three dimensional (3D) format, as known in the art.
  • Next, the face in the captured image is verified 1110, according to a symmetry criterion, say by the face verifier 610, as described in further detail hereinabove.
  • Optionally, when the face in the image is found to be non-symmetric (i.e. when the face fails to meet the symmetry criterion), the image capturer is instructed (say by the face verifier 610) to capture a new image of the face.
  • The image capturer may present an appropriate message, say a message asking an individual whose face image is captured to look straight into the image capturer, or align in a position in front of the image capturer (say, a still camera), as described in further detail hereinbelow. Then, the image capturer captures a new image of the face.
  • When the symmetry of the face is successfully verified 1110, data which includes the image (or a part of the image) is sent for identification, say over a wide area network, such as the internet, as described in further detail hereinabove.
  • Optionally, before identification the image is pre-processed 1180, say by the face identifier 620, using one of several pre-processing methods currently used for face recognition. The pre-processing methods may be used for sharpening, grey scale modification, removal of red eyes, etc., as known in the art.
  • Optionally, there are extracted one or more features from the image, as described in further detail hereinabove.
  • Finally, the face is identified 1190, say by the face identifier 620, as described in further detail hereinabove.
  • Reference is now made to FIG. 12, which is a flowchart illustrating a fifth method for face recognition, according to an exemplary embodiment of the present invention.
  • In a fifth method, according to an exemplary embodiment of the present invention, a video image is captured 1200, say by a video camera, as described in further detail hereinabove.
  • Next, there is detected 1201 a face in the captured image.
  • The face may be detected using one or more methods currently used for detecting a face in the image. The methods currently used may include, but are not limited to: Viola-Jones detection methods, Gabor Jets based methods, skin detection methods, histogram analysis methods, or other methods (say methods based on edge maps, gradients, or standard face shapes, etc.), as known in the art.
  • Viola-Jones methods use several image processing filters over the whole image. A neural network algorithm is trained over a training set (say a set of already processed face images). The face is detected using the neural network algorithm and a search is made to find best match values that predict the face center location, as known in the art.
  • Gabor Jets methods use a convolution of Fourier Coefficient of the image with wavelets coefficients of low order, where the values that predict face location are set according to empirically found predictive values, as known in the art.
  • Skin detectors analyze an intensity map presentation of the image, in order to find pixel intensity values which comply with standard skin values, as known in the art.
  • Histogram analysis methods analyze a histogram of the image, say a Pixel Frequency Histogram, after applying several filters (histogram normalization, histogram stretching, etc.) on the histogram of the image. The filters applied on the image's histogram may enable separation of face from background, as known in the art.
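As one non-authoritative sketch of the skin-detection approach above, the following applies a commonly cited RGB skin-range rule of thumb; the specific thresholds are an assumption made for illustration, and do not represent the "standard skin values" the text leaves unspecified:

```python
import numpy as np

def skin_mask(rgb):
    # Boolean mask of pixels whose RGB values fall inside a commonly
    # used (heuristic) skin range; thresholds are illustrative only.
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))
```

A connected region of `True` pixels in the returned mask may then be taken as a face candidate for the verification steps that follow.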
  • Next, the image is cropped 1202, and thus background is significantly removed from the image, as illustrated using FIG. 16, and described in further detail hereinbelow.
  • Then, the cropped image is resized 1203 into a size in accordance with a size standard. The size standard may be set by an operator of the first system described in further detail hereinabove.
  • The size standard may improve accuracy and efficiency of identification of the face, since images in a database of face images, which are substantially the same size as the resized image, are more likely to be successfully matched with the resized image, for identifying the face, as described in further detail hereinabove.
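The cropping and resizing steps 1202-1203 may be sketched as follows, under the 15% margin and one-hundred-by-one-hundred-pixel size standard used as examples elsewhere in this description; nearest-neighbour resampling is assumed for simplicity, and any resampling method would serve:

```python
import numpy as np

def crop_and_resize(image, crop_frac=0.15, size=(100, 100)):
    # Crop crop_frac off each side (top, bottom, right and left), thereby
    # removing most of the background, then resize to a standard size
    # using nearest-neighbour sampling.
    h, w = image.shape[:2]
    dy, dx = int(h * crop_frac), int(w * crop_frac)
    cropped = image[dy:h - dy, dx:w - dx]
    ch, cw = cropped.shape[:2]
    rows = np.arange(size[0]) * ch // size[0]
    cols = np.arange(size[1]) * cw // size[1]
    return cropped[rows][:, cols]
```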
  • Next, there are improved 1204 one or more illumination qualities of the image.
  • For example, the illumination qualities of the image may be enhanced using Histogram Equalization, which modifies the dynamic range and contrast of an image by altering the image, as known in the art.
  • Optionally, the histogram equalization employs a monotonic, non-linear mapping, which re-assigns the intensity values of pixels in the image, such that the improved image contains a uniform distribution of intensities (i.e. a flat histogram).
  • Histogram Equalization is usually introduced using continuous (rather than discrete) process functions, as known in the art.
  • Optionally, the histogram equalization employs linear mapping, exponential (or logarithmic) mapping, etc., as known in the art.
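A minimal sketch of the discrete histogram-equalization mapping described above, for an 8-bit grey-scale image, may read:

```python
import numpy as np

def equalize_histogram(gray):
    # Build the cumulative histogram; it defines a monotonic, non-linear
    # mapping that re-assigns pixel intensities toward a uniform
    # distribution (a flat histogram).
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first non-empty bin of the CDF
    scale = max(int(cdf[-1] - cdf_min), 1)
    mapping = np.clip(np.round((cdf - cdf_min) / scale * 255), 0, 255).astype(np.uint8)
    return mapping[gray]
```

Applied to a low-contrast image, the mapping stretches the occupied intensity range to the full 0-255 scale, thereby improving the illumination qualities of the image.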
  • Next, the symmetry of the face in the image is verified 1207, using a symmetry criterion, as described in further detail hereinbelow.
  • Upon successful verification 1215 of the symmetry of the face in the image, data which includes at least a part of the image is sent 1220 over a network (say a wide area network, such as the internet). The face in the image is identified 1220, say using the face identifier 620, as described in further detail hereinabove.
  • If the image is found to be non-symmetric (i.e. if the image fails to comply with the symmetry criterion), an image of the face is captured again 1200, as described in further detail hereinabove.
  • The symmetry criterion may be based on an intensity map of the image, a phase map of the image, a texture map of the image, a statistical model run over images previously received (say by comparison with an average image calculated from previously received images, which is likely to be symmetric), a comparison made between the image and one or more images previously captured from the same user, etc., as described in further detail hereinabove.
  • The symmetry criterion may be predefined before the images are stored in a face database, as described in further detail hereinabove.
  • In the face database, there are stored images of known faces, which also meet the symmetry criterion, as described in further detail hereinbelow. Thus, according to exemplary embodiments of the present invention, the symmetry criterion is enforced on all face images the method is used on.
  • The symmetry criterion may improve accuracy and efficiency of identification of the face in the image.
  • For example, in order to meet the symmetry criterion, the face is aligned into a position where the face appears symmetric (say a position where an individual looks straight into a camera).
  • Consequently, there is produced a significantly uniform face alignment amongst the images.
  • The uniform face alignment may ease identification of a face in a new image, through comparison with images in the face database. The identification may be eased, since the uniform face alignment may increase similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • Consequently, face recognition rates, such as FAR (False Acceptance Rate) and FRR (False Rejection Rate), may be improved.
  • Further, when an individual has to align his face into the position where the individual's face appears symmetric, the individual is less likely to use an extreme facial expression. Extreme facial expressions (such as a widely opened mouth) are known to pose a problem, as far as face recognition (i.e. identification) is concerned.
  • Reference is now made to FIG. 13, which is a flowchart illustrating a sixth method for face recognition, according to an exemplary embodiment of the present invention.
  • In a sixth method for face recognition, according to an exemplary embodiment of the present invention, there is verified 1310 the compliance of a face with a predefined criterion, as described in further detail hereinabove.
  • In one example, symmetry of a face in one or more image(s), (say a still video image of a face of an individual, a sequence of video images of an individual, etc.) is verified 1310 according to a symmetry criterion, say using the face verifier 710, as described in further detail hereinbelow.
  • The symmetry criterion may be defined by a user of the seventh system, as described in further detail hereinbelow.
  • Optionally, the sixth method further includes using an intensity map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the sixth method further includes using a gradient map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • Optionally, the sixth method further includes using a fast Fourier Transform (FFT) phase map, for verifying 1310 the symmetry of the face in the image, as described in further detail hereinbelow.
  • Next, the image's forwarding is controlled 1370 (say by the face verifier 710, as described in further detail hereinabove).
  • For example, when the face in the image is found to be non-symmetric, the sending of data which includes the image (or a part of the image) over the internet (or another wide area network) may be blocked.
  • Reference is now made to FIG. 14, which is a flowchart illustrating a seventh method for face recognition, according to an exemplary embodiment of the present invention.
  • A seventh method, according to a preferred embodiment of the present invention, uses an intensity map of an image captured, say by an image capturer (a still camera, a video camera, etc.).
  • In the seventh method, the face is found 1401 in the image, as described in further detail hereinabove.
  • Next, the image is cropped 1402, say 15% on each side (top, bottom, right and left), along a rectangle, as described in further detail, and illustrated using FIG. 16 hereinbelow.
  • The cropped image is resized 1403, say to one hundred by one hundred pixels.
  • Optionally, the image is modified, using histogram equalization 1404 (say Linear Histogram Equalization), as described in further detail hereinabove.
  • Next, there is verified the symmetry of the face in the image, through the following:
  • The image is divided 1405 into equal parts: a left side and a right side, along a vertical line passing through a point in the middle of the image.
  • Next, an average pixel intensity is calculated 1406 using all pixels of the right part, denoted hereinbelow as: Right Avg., and an average intensity is calculated 1406 using all pixels of the left part, denoted hereinbelow as: Left Avg.
  • Next, the left side is transformed 1407. For each old pixel Pold(i, j) of the left side, a new value is computed using Formula 1, yielding a corresponding new value for the pixel, denoted hereinbelow as Pnew(i, j).
  • Pnew(i, j) = Pold(i, j) × (Right Avg. / Left Avg.)  Formula 1
  • The new pixel values Pnew(i, j) form a new image, which comprises the new values calculated for the pixels of the left side, and the original values of the pixels of the right side. The new image is denoted hereinbelow as: Inew.
  • Next, the new image Inew is flipped 1408 over a central vertical line (i.e. a line which divides the new image into two equal parts, at the image's center), to form a flipped image denoted hereinbelow as Iflipped.
  • Then, for each pixel (i, j), there is computed a difference 1409 between the intensity of the pixel in Inew and the intensity of the pixel in Iflipped, using Formula 2:

  • Diffi,j = |Inew(i, j) − Iflipped(i, j)|  Formula 2
  • The resultant difference is denoted: Diffi,j.
  • Next, there is computed 1410 the symmetry of the face, by dividing the average of the differences (Diffi,j) of the pixel intensities calculated using Formula 2 by the average of the pixel intensities of Inew, as formulated by Formula 3:
  • Symmetry = Avg(Diffi,j) / Avg(Inew)  Formula 3
  • According to an exemplary embodiment, the threshold for symmetry (i.e. symmetry criterion) is set at 0.35. If symmetry<0.35, the face is successfully verified 1411 as symmetric. If symmetry>=0.35, the face is determined to be non-symmetric, and a new image has to be captured, as described in further detail hereinabove.
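For illustration, the computation of Formulas 1 through 3 above may be sketched as follows. This is a non-authoritative sketch: it assumes a grey-scale image of even width that has already been cropped, resized and equalized, and the 0.35 threshold follows the exemplary criterion:

```python
import numpy as np

def intensity_symmetry(gray):
    # Formulas 1-3: brightness-normalize the left half against the right,
    # flip about the central vertical line, and compare pixel-by-pixel.
    g = gray.astype(float)
    h, w = g.shape
    half = w // 2
    left, right = g[:, :half], g[:, w - half:]
    # Formula 1: Pnew(i, j) = Pold(i, j) * (Right Avg. / Left Avg.)
    i_new = g.copy()
    i_new[:, :half] = left * (right.mean() / left.mean())
    i_flipped = i_new[:, ::-1]  # flip over the central vertical line
    # Formula 2: Diff(i, j) = |Inew(i, j) - Iflipped(i, j)|
    diff = np.abs(i_new - i_flipped)
    # Formula 3: Symmetry = Avg(Diff) / Avg(Inew)
    return float(diff.mean() / i_new.mean())

def is_symmetric(gray, threshold=0.35):
    # Exemplary symmetry criterion: verified if Symmetry < 0.35.
    return intensity_symmetry(gray) < threshold
```

A perfectly mirror-symmetric face image yields a symmetry value of 0; the larger the value, the less symmetric the face.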
  • Reference is now made to FIG. 15, which is a flowchart illustrating an eighth method for face recognition, according to an exemplary embodiment of the present invention.
  • An eighth method, according to a preferred embodiment of the present invention, uses a phase map of an image captured, say by an image capturer (a still camera, a video camera, etc.). The phase map may be calculated using Fourier Transform (FT), as known in the art.
  • In the eighth method, the face is found 1501 in the image, as described in further detail hereinabove.
  • Next, the image is cropped 1502, say 15% on each side (top, bottom, right and left), along a rectangle, as described in further detail, and illustrated using FIG. 16 hereinbelow.
  • The cropped image is resized 1503, say to one hundred by one hundred pixels.
  • Optionally, the image is modified, using histogram equalization 1504 (say Linear Histogram Equalization), as described in further detail hereinabove.
  • Next, there is verified the symmetry of the face in the image, through the following:
  • The image is divided 1505 into equal parts: a left side and a right side, along a vertical line.
  • Next, the right side is flipped 1506 vertically.
  • Next, there is computed 1507 the Fourier Transform (FT) for the right side and for the left side. The resultant phase maps are denoted hereinbelow as Iright and Ileft respectively.
  • Next, there is computed 1508 the difference between Iright and Ileft, using Formula 4, where Diff denotes the difference between the two.

  • Diff = |Iright − Ileft|  Formula 4
  • Next, there is computed 1509 symmetry for the image, using Formula 5.
  • Symmetry = Diff / (Number of pixels of half image)  Formula 5
  • According to an exemplary embodiment, the threshold for symmetry (i.e. the symmetry criterion) is set at 35. If symmetry<35, the face is successfully verified 1510 as symmetric. If symmetry>=35, the face is determined to be non-symmetric, and a new image has to be captured, as described in further detail hereinabove.
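The phase-map computation of Formulas 4 and 5 may likewise be sketched. Summing the absolute phase differences over the half-image is an assumption made for illustration, since the text leaves the aggregation over the phase map implicit:

```python
import numpy as np

def phase_symmetry(gray):
    # Split along the central vertical line and flip the right side so it
    # aligns with the left, then compare the FFT phase maps of the halves.
    g = gray.astype(float)
    h, w = g.shape
    half = w // 2
    left = g[:, :half]
    right = g[:, w - half:][:, ::-1]
    # Formula 4: Diff = |Iright - Ileft| over the two phase maps
    phase_left = np.angle(np.fft.fft2(left))
    phase_right = np.angle(np.fft.fft2(right))
    diff = np.abs(phase_right - phase_left)
    # Formula 5: Symmetry = Diff / (number of pixels of half image)
    return float(diff.sum() / (h * half))
```

Under the exemplary criterion, the face is verified as symmetric when the returned value is below 35.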
  • Reference is now made to FIG. 16, which illustrates cropping of an image of a face, according to an exemplary embodiment of the present invention.
  • According to an exemplary embodiment of the present invention, an image of a face may be cropped, say 15% on each side, along a rectangle. Consequently, the background is significantly removed from the image.
  • The cropping of the image may result in more efficient and accurate face recognition, as the identification is carried out on the face 1611 itself, without unnecessary processing of background details, such as a collar 1612, which have nothing to do with the face itself.
  • The removal of the background details may also ease identification of a face, by introducing increased similarity between face images of the same individual, especially as far as two dimensional (2D) images are concerned.
  • The methods for face recognition, as described hereinabove, may also be used in a variety of systems where symmetry information may prove helpful.
  • The systems may include, but are not limited to: 2D or 3D systems, security systems, access control, HLS (Homeland Security), ATMs (Automatic Teller Machines), web portals, or any application which requires recognition of the subject.
  • The systems may also include: passport picture capturing, standard image capturing (thus enforcing a standard for image capturing, say for e-Passport or e-ID generation, as known in the art).
  • The systems described in further detail hereinabove, may be implemented using a Personal Computer, an embedded system, an FPGA (Field Programmable Gate Array), or any other computing device.
  • Reference is now made to FIGS. 17A, 17B, and 17C, which illustrate a face recognition scenario, according to an exemplary embodiment of the present invention.
  • In a first recognition scenario, according to an exemplary embodiment of the present invention, a user approaches a face recognition system, say a face recognition system based on the seventh system described in further detail hereinabove.
  • The user may be asked to get closer to a camera (say using a message displayed on a video monitor), as illustrated in FIG. 17A.
  • Next, an image of the user's face is captured by the camera.
  • If the face verifier 710 finds the face in the image to be non-symmetric, the user is asked to look straight into the camera (say using a message displayed on a video monitor), as illustrated in FIG. 17B.
  • The camera captures a second image of the user who looks straight into the camera.
  • As the user looks straight into the camera, the face verifier 710 verifies that the user's face in the second image is indeed symmetric, as described in further detail hereinabove.
  • Consequently, data which includes the second image (or features extracted from the second image) is forwarded to the face identifier 720, which identifies the user. The data may be forwarded over a wide area network 715, say the internet, as described in further detail hereinabove.
  • Upon successful identification of the user, a relevant message is presented to the user, say a welcome message, as illustrated in FIG. 17C.
  • It is expected that during the life of this patent many relevant devices and systems will be developed and the scope of the terms herein, particularly of the terms “Camera”, “Image”, “Photo”, “Computer”, “Network”, “Internet”, and “Intranet”, is intended to include all such new technologies a priori.
  • It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
  • Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
  • All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.

Claims (49)

1. A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and conditioned upon compliance of the face with a predefined criterion, to forward the image for feature extraction; a feature extractor, associated with said face verifier, configured to extract a feature from the forwarded image; and a face identifier, communicating with said feature extractor over a network, configured to receive the extracted feature and identify the face in the forwarded image, using the extracted feature.
2. The networked system of claim 1, wherein said network is a wide area network.
3. The networked system of claim 1, wherein said network is the internet.
4. The networked system of claim 1, wherein the predefined criterion is indicative of symmetry of the face.
5. A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and conditioned upon compliance of the face with a predefined criterion, to send the image over a network; a feature extractor, communicating with said face verifier over the network, and configured to receive said sent image and extract a feature from said received image; and a face identifier, associated with said feature extractor and configured to identify the face in the received image, using the extracted feature.
6. The networked system of claim 5, wherein said network is a wide area network.
7. The networked system of claim 5, wherein said network is the Internet.
8. A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and conditioned upon compliance of the face with a predefined criterion, to send the image over a first network; a feature extractor, communicating with said face verifier over the first network, configured to receive said sent image, extract a feature from said received image, and send said extracted feature over a second network; and a face identifier, communicating with said feature extractor over the second network, configured to receive the extracted feature and identify the face in the received image, using the extracted feature.
9. The networked system of claim 8, wherein at least one of said first and second network is a wide area network.
10. The networked system of claim 8, wherein at least one of said first and second network is the internet.
11. A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and conditioned upon compliance of the face with a predefined criterion, to send data comprising at least a part of the image over a network; and a face identifier, communicating with said face verifier over the network, configured to receive the sent data and identify the face, using at least a part of the received data.
12. The networked system of claim 11, wherein said network is a wide area network.
13. The networked system of claim 11, wherein said network is the internet.
14. The networked system of claim 11, wherein said network is an intranet network.
15. The networked system of claim 11, further comprising an image capturer, associated with said face verifier, and configured to capture said image of said face.
16. The networked system of claim 11, further comprising a face detector, associated with said face verifier, and configured to detect said face in said image.
17. The networked system of claim 16, wherein said face detector is further configured to use a skin detection method, for detecting said face in said image.
18. The networked system of claim 16, wherein said face detector is further configured to use a Viola-Jones detection method, for detecting said face in said image.
19. The networked system of claim 16, wherein said face detector is further configured to use a Gabor Filter based method, for detecting said face in said image.
20. The networked system of claim 11, further comprising an image cropper, associated with said face verifier, and configured to crop said image, thereby to remove background from said image.
21. The networked system of claim 11 further comprising an image resizer, associated with said face verifier, and configured to resize said image into a predefined size.
22. The networked system of claim 11, further comprising an image illumination quality improver, associated with said face verifier, and configured to improve a quality of illumination of said image.
23. The networked system of claim 22, wherein said image illumination quality improver is further configured to use Histogram Equalization, for improving said quality of illumination of said image.
24. The networked system of claim 11, wherein said face verifier is further configured to use an intensity map, for verifying the compliance of the face with the predefined criterion.
25. The networked system of claim 11, wherein said face verifier is further configured to use a gradient map, for verifying the compliance of the face with the predefined criterion.
26. The networked system of claim 11, wherein said face verifier is further configured to use a Fourier Transform phase map, for verifying the compliance of the face with the predefined criterion.
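The intensity, gradient, and Fourier Transform phase maps of claims 24-26 can all be derived from one grey-level image. The claims leave open how a verifier thresholds these maps, so the sketch below only computes them:

```python
import numpy as np

def quality_maps(img):
    """Per-pixel maps a verifier might test against a criterion:
    the intensity map is the image itself (claim 24), the gradient map
    its local edge strength (claim 25), and the phase map the argument
    of its 2-D discrete Fourier transform (claim 26)."""
    intensity = img.astype(np.float64)
    gy, gx = np.gradient(intensity)          # derivatives along rows, cols
    gradient = np.hypot(gx, gy)              # edge magnitude per pixel
    phase = np.angle(np.fft.fft2(intensity)) # DFT phase per frequency bin
    return intensity, gradient, phase

img = np.zeros((8, 8))
img[:, 4:] = 255.0                           # vertical step edge
intensity, gradient, phase = quality_maps(img)
# The gradient map responds only near the step; flat regions stay zero.
```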
27. The networked system of claim 11, wherein said face verifier is further configured to measure said compliance of said face in each one of a plurality of input images, and select at least one image of said face among said plurality of input images, and wherein said measured compliance of said face in said at least one selected image of said face is highest amongst said input images.
28. The networked system of claim 27, wherein said plurality of input images are at least a part of a video sequence.
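Claims 27-28 measure compliance across several input images, e.g. frames of a video sequence, and keep the highest-scoring one. The sketch below uses mean gradient magnitude as a hypothetical compliance measure, so sharp, detailed frames outscore blurred or empty ones; the patent does not prescribe this particular score:

```python
import numpy as np

def compliance_score(frame):
    """Hypothetical measure: mean gradient magnitude, so frames with
    little detail (blur, occlusion, empty scene) score low."""
    gy, gx = np.gradient(frame.astype(np.float64))
    return float(np.hypot(gx, gy).mean())

def select_best_frame(frames):
    """Score every input frame and return the index of the one whose
    measured compliance is highest (claims 27-28)."""
    scores = [compliance_score(f) for f in frames]
    return int(np.argmax(scores)), scores

sharp = np.indices((8, 8)).sum(0) % 2 * 255.0   # high-frequency detail
blurry = np.full((8, 8), 128.0)                 # no detail at all
best, scores = select_best_frame([blurry, sharp, blurry])
```

Only the selected frame would then be forwarded over the network, keeping the identifier's input quality high while transmitting a single image per sequence.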
29. A networked system for face recognition, the system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and conditioned upon compliance of the face with a predefined criterion, to send data comprising at least a part of the image over a network; and a face database updater, communicating with said face verifier over the network, and configured to receive the sent data and update a face database with at least a part of the received data.
30. The networked system of claim 29, wherein said network is a wide area network.
31. A networked system for face recognition, the networked system comprising: a face verifier, configured to verify compliance of a face in an image with at least one predefined criterion, and restrict forwarding of data comprising at least a part of the image over a network, according to results of the verification of the compliance.
32. Method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; b) sending data comprising at least a part of the image over a network, for identification, said sending conditioned upon compliance of the face with the predefined criterion; and c) identifying the face in the image, using at least a part of the sent data.
33. The method of claim 32, further comprising capturing said image of said face.
34. The method of claim 32, further comprising detecting said face in said image.
35. The method of claim 32, further comprising using a skin detection method, for detecting said face in said image.
36. The method of claim 32, further comprising using a Viola-Jones detection method, for detecting said face in said image.
37. The method of claim 32, further comprising using a Gabor Filter based method, for detecting said face in said image.
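A common skin detection method of the kind recited in claims 17 and 35 thresholds the chroma planes of the YCbCr colour space. The Cb/Cr ranges below are the classic values from the face-detection literature and are illustrative, not taken from the patent:

```python
import numpy as np

def skin_mask(rgb):
    """Classify pixels as skin by converting RGB to YCbCr and keeping
    pixels whose chroma falls in the commonly used skin ranges
    (77 <= Cb <= 127, 133 <= Cr <= 173; thresholds are illustrative)."""
    r, g, b = [rgb[..., i].astype(np.float64) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)

skin = np.array([[[200, 140, 120]]], dtype=np.uint8)    # skin-toned pixel
grass = np.array([[[30, 160, 40]]], dtype=np.uint8)     # green pixel
```

A detector would take the largest connected region of the mask as a face candidate; luminance is deliberately ignored so the rule tolerates uneven lighting.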
38. The method of claim 32, further comprising cropping said image, thereby removing background from said image.
39. The method of claim 32, further comprising resizing said image into a predefined size.
40. The method of claim 32, further comprising improving a quality of illumination of said image.
41. The method of claim 40, further comprising using Histogram Equalization, for improving said quality of illumination of said image.
42. The method of claim 32, further comprising using an intensity map, for verifying the compliance with the predefined criterion.
43. The method of claim 32, further comprising using a gradient map, for verifying the compliance with the predefined criterion.
44. The method of claim 32, further comprising using a Fourier Transform phase map, for verifying the compliance with the predefined criterion.
45. The method of claim 32, further comprising measuring said compliance of said face in each one of a plurality of input images, and selecting at least one image of said face among said plurality of input images, and wherein said measured compliance of said face in said at least one selected image of said face is highest amongst said input images.
46. The method of claim 45, wherein said plurality of input images are at least a part of a video sequence.
47. Method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; b) sending data comprising at least a part of the image over a network, said sending conditioned upon compliance of the face with a predefined criterion; and c) updating a database of images with at least a part of the sent data.
48. The method of claim 47, wherein said network is a wide area network.
49. Method for face recognition, the method comprising: a) verifying compliance of a face in an image with a predefined criterion; and b) controlling forwarding of data comprising at least a part of the image through a network, according to a result of said verifying of the compliance.
US12/919,092 2008-07-02 2009-06-24 Networked Face Recognition System Abandoned US20100329568A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/919,092 US20100329568A1 (en) 2008-07-02 2009-06-24 Networked Face Recognition System

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13371108P 2008-07-02 2008-07-02
US12/919,092 US20100329568A1 (en) 2008-07-02 2009-06-24 Networked Face Recognition System
PCT/IB2009/052722 WO2010001311A1 (en) 2008-07-02 2009-06-24 Networked face recognition system

Publications (1)

Publication Number Publication Date
US20100329568A1 true US20100329568A1 (en) 2010-12-30

Family

ID=41465533

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/919,076 Expired - Fee Related US8600121B2 (en) 2008-07-02 2009-06-24 Face recognition system and method
US12/919,092 Abandoned US20100329568A1 (en) 2008-07-02 2009-06-24 Networked Face Recognition System
US14/027,472 Abandoned US20140016836A1 (en) 2008-07-02 2013-09-16 Face recognition system and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/919,076 Expired - Fee Related US8600121B2 (en) 2008-07-02 2009-06-24 Face recognition system and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/027,472 Abandoned US20140016836A1 (en) 2008-07-02 2013-09-16 Face recognition system and method

Country Status (3)

Country Link
US (3) US8600121B2 (en)
EP (2) EP2291796A1 (en)
WO (2) WO2010001311A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2011224140B2 (en) * 2010-09-27 2015-12-17 Multitrode Pty Ltd Controlling Access to a Control Panel Compartment
US8848068B2 (en) * 2012-05-08 2014-09-30 Oulun Yliopisto Automated recognition algorithm for detecting facial expressions
TW201401184A (en) * 2012-06-18 2014-01-01 Altek Corp Smart reminding apparatus and method thereof
CN103514428A (en) * 2012-06-18 2014-01-15 华晶科技股份有限公司 Intelligent prompting device and method
CA2934514C (en) * 2013-12-19 2021-04-06 Avigilon Fortress Corporation System and method for identifying faces in unconstrained media
US10127754B2 (en) * 2014-04-25 2018-11-13 Vivint, Inc. Identification-based barrier techniques
US10657749B2 (en) 2014-04-25 2020-05-19 Vivint, Inc. Automatic system access using facial recognition
US10235822B2 (en) 2014-04-25 2019-03-19 Vivint, Inc. Automatic system access using facial recognition
US10274909B2 (en) 2014-04-25 2019-04-30 Vivint, Inc. Managing barrier and occupancy based home automation system
US11615663B1 (en) * 2014-06-17 2023-03-28 Amazon Technologies, Inc. User authentication system
US9986086B2 (en) * 2014-07-31 2018-05-29 Samsung Electronics Co., Ltd. Mobile terminal and method of operating the same
US9389083B1 (en) 2014-12-31 2016-07-12 Motorola Solutions, Inc. Method and apparatus for prediction of a destination and movement of a person of interest
CN105590103B (en) * 2015-12-30 2019-10-01 中国银联股份有限公司 Eyeball recognition methods and system
US10977509B2 (en) 2017-03-27 2021-04-13 Samsung Electronics Co., Ltd. Image processing method and apparatus for object detection
CN109886864B (en) * 2017-12-06 2021-03-09 杭州海康威视数字技术股份有限公司 Privacy mask processing method and device
CN109410133B (en) * 2018-09-30 2021-08-24 北京航空航天大学青岛研究院 Face texture repairing method based on 3DMM
US10853631B2 (en) 2019-07-24 2020-12-01 Advanced New Technologies Co., Ltd. Face verification method and apparatus, server and readable storage medium
US11283937B1 (en) * 2019-08-15 2022-03-22 Ikorongo Technology, LLC Sharing images based on face matching in a network

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5625704A (en) * 1994-11-10 1997-04-29 Ricoh Corporation Speaker recognition using spatiotemporal cues
JP3452685B2 (en) * 1995-05-10 2003-09-29 三菱電機株式会社 Face image processing device
GB2341231A (en) * 1998-09-05 2000-03-08 Sharp Kk Face detection in an image
SG91841A1 (en) * 1999-11-03 2002-10-15 Kent Ridge Digital Labs Face direction estimation using a single gray-level image
US7221780B1 (en) * 2000-06-02 2007-05-22 Sony Corporation System and method for human face detection in color graphics images
US7499104B2 (en) * 2003-05-16 2009-03-03 Pixel Instruments Corporation Method and apparatus for determining relative timing of image and associated information
US7483569B2 (en) * 2003-05-29 2009-01-27 Carnegie Mellon University Reduced complexity correlation filters
US20050063568A1 (en) * 2003-09-24 2005-03-24 Shih-Ching Sun Robust face detection algorithm for real-time video sequence
KR100559471B1 (en) * 2003-12-17 2006-03-10 한국전자통신연구원 System and method for detecting face using symmetric axis
US7436988B2 (en) * 2004-06-03 2008-10-14 Arizona Board Of Regents 3D face authentication and recognition based on bilateral symmetry analysis
JP4180027B2 (en) * 2004-08-27 2008-11-12 株式会社豊田中央研究所 Facial part position detection apparatus and method, and program
GB2432064B (en) * 2005-10-31 2011-01-19 Hewlett Packard Development Co Method of triggering a detector to detect a moving feature within a video stream
US8265349B2 (en) * 2006-02-07 2012-09-11 Qualcomm Incorporated Intra-mode region-of-interest video object segmentation
GB2451050B (en) * 2006-05-05 2011-08-31 Parham Aarabi Method, system and computer program product for automatic and semiautomatic modification of digital images of faces
US7860280B2 (en) * 2006-06-09 2010-12-28 Samsung Electronics Co., Ltd. Facial feature detection method and device
JP4910507B2 (en) * 2006-06-29 2012-04-04 コニカミノルタホールディングス株式会社 Face authentication system and face authentication method
US7403643B2 (en) * 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
JP4289415B2 (en) * 2007-03-27 2009-07-01 セイコーエプソン株式会社 Image processing for image transformation

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5164992A (en) * 1990-11-01 1992-11-17 Massachusetts Institute Of Technology Face recognition system
US5930804A (en) * 1997-06-09 1999-07-27 Philips Electronics North America Corporation Web-based biometric authentication system and method
US20030059124A1 (en) * 1999-04-16 2003-03-27 Viisage Technology, Inc. Real-time facial recognition and verification system
US20060064716A1 (en) * 2000-07-24 2006-03-23 Vivcom, Inc. Techniques for navigating multiple video streams
US20060050932A1 (en) * 2000-09-15 2006-03-09 Tumey David M Fingerprint verification system
US6806980B2 (en) * 2000-12-28 2004-10-19 Xerox Corporation Adaptive illumination correction of scanned images
US7050608B2 (en) * 2001-03-09 2006-05-23 Kabushiki Kaisha Toshiba Face image recognition apparatus
US20030086593A1 (en) * 2001-05-31 2003-05-08 Chengjun Liu Feature based classification
US20030115490A1 (en) * 2001-07-12 2003-06-19 Russo Anthony P. Secure network and networked devices using biometrics
US7221809B2 (en) * 2001-12-17 2007-05-22 Genex Technologies, Inc. Face recognition system and method
US6853739B2 (en) * 2002-05-15 2005-02-08 Bio Com, Llc Identity verification system
US20030215114A1 (en) * 2002-05-15 2003-11-20 Biocom, Llc Identity verification system
US7298876B1 (en) * 2002-11-04 2007-11-20 R2 Technology, Inc. Method and apparatus for quality assurance and quality control in radiological equipment using automatic analysis tools
US20040117638A1 (en) * 2002-11-21 2004-06-17 Monroe David A. Method for incorporating facial recognition technology in a multimedia surveillance system
US20040151381A1 (en) * 2002-11-29 2004-08-05 Porter Robert Mark Stefan Face detection
US20060050933A1 (en) * 2004-06-21 2006-03-09 Hartwig Adam Single image based multi-biometric system and method
US20060000895A1 (en) * 2004-07-01 2006-01-05 American Express Travel Related Services Company, Inc. Method and system for facial recognition biometrics on a smartcard
US20060206724A1 (en) * 2005-02-16 2006-09-14 David Schaufele Biometric-based systems and methods for identity verification
US20070152037A1 (en) * 2005-12-29 2007-07-05 Industrial Technology Research Institute Three-dimensional face recognition system and method
US7620217B2 (en) * 2005-12-29 2009-11-17 Industrial Technology Research Institute Three-dimensional face recognition system and method
US20080232651A1 (en) * 2007-03-22 2008-09-25 Artnix Inc. Apparatus and method for detecting face region

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070098220A1 (en) * 2005-10-31 2007-05-03 Maurizio Pilu Method of triggering a detector to detect a moving feature within a video stream
US20120123821A1 (en) * 2010-11-16 2012-05-17 Raytheon Company System and Method for Risk Assessment of an Asserted Identity
US9519818B2 (en) 2011-01-20 2016-12-13 Daon Holdings Limited Methods and systems for capturing biometric data
US8064645B1 (en) 2011-01-20 2011-11-22 Daon Holdings Limited Methods and systems for authenticating users
US8457370B2 (en) 2011-01-20 2013-06-04 Daon Holdings Limited Methods and systems for authenticating users with captured palm biometric data
US10607054B2 (en) 2011-01-20 2020-03-31 Daon Holdings Limited Methods and systems for capturing biometric data
US8548206B2 (en) 2011-01-20 2013-10-01 Daon Holdings Limited Methods and systems for capturing biometric data
US9298999B2 (en) 2011-01-20 2016-03-29 Daon Holdings Limited Methods and systems for capturing biometric data
US9112858B2 (en) 2011-01-20 2015-08-18 Daon Holdings Limited Methods and systems for capturing biometric data
US9679193B2 (en) 2011-01-20 2017-06-13 Daon Holdings Limited Methods and systems for capturing biometric data
US9202102B1 (en) 2011-01-20 2015-12-01 Daon Holdings Limited Methods and systems for capturing biometric data
US9400915B2 (en) 2011-01-20 2016-07-26 Daon Holdings Limited Methods and systems for capturing biometric data
US8085992B1 (en) 2011-01-20 2011-12-27 Daon Holdings Limited Methods and systems for capturing biometric data
US10235550B2 (en) 2011-01-20 2019-03-19 Daon Holdings Limited Methods and systems for capturing biometric data
US9990528B2 (en) 2011-01-20 2018-06-05 Daon Holdings Limited Methods and systems for capturing biometric data
US9519820B2 (en) 2011-01-20 2016-12-13 Daon Holdings Limited Methods and systems for authenticating users
US9519821B2 (en) 2011-01-20 2016-12-13 Daon Holdings Limited Methods and systems for capturing biometric data
US9558415B2 (en) 2011-06-07 2017-01-31 Accenture Global Services Limited Biometric authentication technology
US9600730B2 (en) 2011-06-07 2017-03-21 Accenture Global Services Limited Biometric authentication technology
US9236024B2 (en) 2011-12-06 2016-01-12 Glasses.Com Inc. Systems and methods for obtaining a pupillary distance measurement using a mobile computing device
US20130195316A1 (en) * 2012-01-30 2013-08-01 Accenture Global Services Limited System and method for face capture and matching
US9230157B2 (en) * 2012-01-30 2016-01-05 Accenture Global Services Limited System and method for face capture and matching
US9875392B2 (en) * 2012-01-30 2018-01-23 Accenture Global Services Limited System and method for face capture and matching
US9773157B2 (en) 2012-01-30 2017-09-26 Accenture Global Services Limited System and method for face capture and matching
US9390338B2 (en) 2012-04-09 2016-07-12 Accenture Global Services Limited Biometric matching technology
US9195893B2 (en) 2012-04-09 2015-11-24 Accenture Global Services Limited Biometric matching technology
US9483689B2 (en) 2012-04-09 2016-11-01 Accenture Global Services Limited Biometric matching technology
US9582723B2 (en) 2012-04-09 2017-02-28 Accenture Global Services Limited Biometric matching technology
US9292749B2 (en) 2012-04-09 2016-03-22 Accenture Global Services Limited Biometric matching technology
US9286715B2 (en) 2012-05-23 2016-03-15 Glasses.Com Inc. Systems and methods for adjusting a virtual try-on
US9235929B2 (en) 2012-05-23 2016-01-12 Glasses.Com Inc. Systems and methods for efficiently processing virtual 3-D data
US9483853B2 (en) 2012-05-23 2016-11-01 Glasses.Com Inc. Systems and methods to display rendered images
US9208608B2 (en) 2012-05-23 2015-12-08 Glasses.Com, Inc. Systems and methods for feature tracking
US10147233B2 (en) 2012-05-23 2018-12-04 Glasses.Com Inc. Systems and methods for generating a 3-D model of a user for a virtual try-on product
US9378584B2 (en) 2012-05-23 2016-06-28 Glasses.Com Inc. Systems and methods for rendering virtual try-on products
US9311746B2 (en) 2012-05-23 2016-04-12 Glasses.Com Inc. Systems and methods for generating a 3-D model of a virtual try-on product
US20140079319A1 (en) * 2012-09-20 2014-03-20 Htc Corporation Methods for enhancing images and apparatuses using the same
US11798113B1 (en) 2014-12-01 2023-10-24 Securus Technologies, Llc Automated background check via voice pattern matching
US10902054B1 2014-12-01 2021-01-26 Securus Technologies, Inc. Automated background check via voice pattern matching
US9922048B1 (en) * 2014-12-01 2018-03-20 Securus Technologies, Inc. Automated background check via facial recognition
US20190129904A1 (en) * 2015-05-29 2019-05-02 Accenture Global Services Limited Face recognition image data cache
US10402660B2 (en) 2015-05-29 2019-09-03 Accenture Global Solutions Limited Local caching for object recognition
US10762127B2 (en) * 2015-05-29 2020-09-01 Accenture Global Services Limited Face recognition image data cache
US11487812B2 (en) 2015-05-29 2022-11-01 Accenture Global Services Limited User identification using biometric image data cache
US10146797B2 (en) 2015-05-29 2018-12-04 Accenture Global Services Limited Face recognition image data cache
US10055646B2 (en) 2015-05-29 2018-08-21 Accenture Global Solutions Limited Local caching for object recognition
CN105427421A (en) * 2015-11-16 2016-03-23 苏州市公安局虎丘分局 Entrance guard control method based on face recognition
US9858404B2 (en) * 2015-12-15 2018-01-02 International Business Machines Corporation Controlling privacy in a face recognition application
US9934397B2 (en) 2015-12-15 2018-04-03 International Business Machines Corporation Controlling privacy in a face recognition application
US20170169205A1 (en) * 2015-12-15 2017-06-15 International Business Machines Corporation Controlling privacy in a face recognition application
US20170169206A1 (en) * 2015-12-15 2017-06-15 International Business Machines Corporation Controlling privacy in a face recognition application
US10255453B2 (en) 2015-12-15 2019-04-09 International Business Machines Corporation Controlling privacy in a face recognition application
US9747430B2 (en) * 2015-12-15 2017-08-29 International Business Machines Corporation Controlling privacy in a face recognition application
CN105893969A (en) * 2016-04-01 2016-08-24 张海东 Using method of automatic face recognition system
CN105788048A (en) * 2016-04-13 2016-07-20 时建华 Electronic lock system achieving recognition through fingerprints
CN106910266A (en) * 2016-08-31 2017-06-30 王玲 Distributed non-contact intelligent gate control system
CN106650623A (en) * 2016-11-18 2017-05-10 广东工业大学 Face detection-based method for verifying personnel and identity document for exit and entry
US10297059B2 (en) 2016-12-21 2019-05-21 Motorola Solutions, Inc. Method and image processor for sending a combined image to human versus machine consumers
CN107179827A (en) * 2017-04-01 2017-09-19 深圳怡化电脑股份有限公司 The intelligent interactive method and system of a kind of finance device
WO2018192448A1 (en) * 2017-04-20 2018-10-25 杭州海康威视数字技术股份有限公司 People-credentials comparison authentication method, system and camera
US11256902B2 (en) 2017-04-20 2022-02-22 Hangzhou Hikvision Digital Technology Co., Ltd. People-credentials comparison authentication method, system and camera
CN108985134A (en) * 2017-06-01 2018-12-11 重庆中科云丛科技有限公司 Face In vivo detection and brush face method of commerce and system based on binocular camera
KR102299847B1 (en) 2017-06-26 2021-09-08 삼성전자주식회사 Face verifying method and apparatus
US10579865B2 (en) * 2017-06-26 2020-03-03 Samsung Electronics Co., Ltd. Facial verification method and apparatus
KR20190001066A (en) * 2017-06-26 2019-01-04 삼성전자주식회사 Face verifying method and apparatus
CN109117700A (en) * 2017-06-26 2019-01-01 三星电子株式会社 Face authentication method and apparatus
WO2019042689A1 (en) 2017-08-29 2019-03-07 Siemens Aktiengesellschaft Person recognition in areas with limited data transmission and data processing
US20190118773A1 (en) * 2017-10-25 2019-04-25 Hyundai Motor Company User authentication system, user authentication method and server
CN110309691A (en) * 2018-03-27 2019-10-08 腾讯科技(深圳)有限公司 A kind of face identification method, device, server and storage medium
US11367311B2 (en) * 2018-03-27 2022-06-21 Tencent Technology (Shenzhen) Company Limited Face recognition method and apparatus, server, and storage medium
WO2019204945A1 (en) * 2018-04-26 2019-10-31 C2Ro Cloud Robotics Inc. System and method for scalable cloud-robotics based face recognition and face analysis
US11527105B2 (en) * 2018-04-26 2022-12-13 C2Ro Cloud Robotics Inc. System and method for scalable cloud-robotics based face recognition and face analysis
US20230260321A1 (en) * 2018-04-26 2023-08-17 C2Ro Cloud Robotics Inc. System And Method For Scalable Cloud-Robotics Based Face Recognition And Face Analysis
US11182997B2 (en) * 2018-10-12 2021-11-23 Nec Corporation Information processing apparatus, information processing method, and storage medium
CN110570549A (en) * 2019-07-26 2019-12-13 华中科技大学 Intelligent unlocking method and corresponding device
US11792187B2 (en) 2020-08-05 2023-10-17 Bank Of America Corporation Multi-person authentication
US11528269B2 (en) 2020-08-05 2022-12-13 Bank Of America Corporation Application for requesting multi-person authentication
US11695760B2 (en) 2020-08-05 2023-07-04 Bank Of America Corporation Application for requesting multi-person authentication
US11792188B2 (en) 2020-08-05 2023-10-17 Bank Of America Corporation Application for confirming multi-person authentication
US20220067342A1 (en) * 2020-08-25 2022-03-03 Covar Applied Technologies, Inc. 3-d object detection and classification from imagery
US11727575B2 (en) * 2020-08-25 2023-08-15 Covar Llc 3-D object detection and classification from imagery
US11138410B1 (en) * 2020-08-25 2021-10-05 Covar Applied Technologies, Inc. 3-D object detection and classification from imagery

Also Published As

Publication number Publication date
US20110091080A1 (en) 2011-04-21
US20140016836A1 (en) 2014-01-16
EP2291795A1 (en) 2011-03-09
WO2010001310A1 (en) 2010-01-07
EP2291796A1 (en) 2011-03-09
WO2010001311A1 (en) 2010-01-07
US8600121B2 (en) 2013-12-03

Similar Documents

Publication Publication Date Title
US20100329568A1 (en) Networked Face Recognition System
US10650261B2 (en) System and method for identifying re-photographed images
US10810423B2 (en) Iris liveness detection for mobile devices
Jee et al. Liveness detection for embedded face recognition system
US20210287026A1 (en) Method and apparatus with liveness verification
US20180034852A1 (en) Anti-spoofing system and methods useful in conjunction therewith
US8280120B2 (en) Fraud resistant biometric financial transaction system and method
US20170262472A1 (en) Systems and methods for recognition of faces e.g. from mobile-device-generated images of faces
US9189686B2 (en) Apparatus and method for iris image analysis
EP2704052A1 (en) Transaction verification system
CN106529414A (en) Method for realizing result authentication through image comparison
WO2008072622A1 (en) Face authentication device
WO2020190397A1 (en) Authentication verification using soft biometric traits
US20220277311A1 (en) A transaction processing system and a transaction method based on facial recognition
Ravi et al. A study on face recognition technique based on Eigenface
Chen et al. Iris recognition using 3D co-occurrence matrix
US20120219192A1 (en) Method of controlling a session at a self-service terminal, and a self-service terminal
Olatinwo et al. Iris recognition technology: implementation, application, and security consideration
Habeeb Comparison between physiological and behavioral characteristics of biometric system
Ferdinand et al. Atm security system modeling using face recognition with facenet and haar cascade
Monwar et al. A robust authentication system using multiple biometrics
KR102579610B1 (en) Apparatus for Detecting ATM Abnormal Behavior and Driving Method Thereof
Dixit et al. SIFRS: Spoof Invariant Facial Recognition System (A Helping Hand for Visual Impaired People)
Srivastava et al. A Machine Learning and IoT-based Anti-spoofing Technique for Liveness Detection and Face Recognition
Fagbolu et al. Secured banking operations with face-based automated teller machine

Legal Events

Date Code Title Description
AS Assignment

Owner name: C-TRUE LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GAMLIEL, AVIHU MEIR;GOLDENBERG, SHMUEL;TSIPIS, FELIX;AND OTHERS;SIGNING DATES FROM 20100810 TO 20100817;REEL/FRAME:024880/0719

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION