While existing automatic gender recognition techniques compare fixed facial features in static images, the new method uses the dynamic movement of the smile, captured on video, to automatically distinguish between men and women.
The study, led by Hassan Ugail, professor at Britain's University of Bradford, mapped 49 landmarks on the face, mainly around the eyes, mouth and down the nose.
The researchers then used these landmarks to assess how the underlying muscle movements change the face as we smile, measuring both the changes in distance between the different points and the 'flow' of the smile.
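The distance-based part of that analysis can be sketched in code. The snippet below is a hypothetical illustration, not the researchers' implementation: it assumes facial landmarks are available as (x, y) points per video frame and computes how the distances between chosen landmark pairs change from a neutral face to the smile's peak.

```python
import math

# Hypothetical illustration (not the study's actual code): given per-frame
# facial landmarks as (x, y) tuples, compute how inter-landmark distances
# change between a neutral frame and the peak of the smile.

def distance(p, q):
    """Euclidean distance between two 2-D landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def distance_changes(neutral_frame, peak_frame, pairs):
    """For each landmark pair (i, j), return the change in distance from
    the neutral face to the smile's peak (positive = points move apart)."""
    return [distance(peak_frame[i], peak_frame[j]) -
            distance(neutral_frame[i], neutral_frame[j])
            for i, j in pairs]

# Toy example: two mouth-corner landmarks that spread apart when smiling.
neutral = {0: (40.0, 60.0), 1: (60.0, 60.0)}   # mouth corners, neutral
peak    = {0: (35.0, 58.0), 1: (65.0, 58.0)}   # mouth corners, smiling

print(distance_changes(neutral, peak, [(0, 1)]))  # → [10.0]
```

In the study itself such distance changes were combined with the 'flow' of the smile over the full set of 49 landmarks.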
The results showed noticeable differences between men and women; in particular, women's smiles were more expansive.
"Anecdotally, women are thought to be more expressive in how they smile, and our research has borne this out. Women definitely have broader smiles, expanding their mouth and lip area far more than men," said Hassan Ugail.
For the study, published in The Visual Computer: International Journal of Computer Graphics, the team created an algorithm using their analysis and tested it against video footage of more than 100 people as they smiled.
The computer was able to correctly determine gender in 86 per cent of cases and the team believe the accuracy could easily be improved.
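The article does not describe the classifier itself, so the following is only a hedged sketch of one simple possibility: learning a threshold on a single mouth-expansion feature, reflecting the finding that women's smiles are broader. The feature values and labels are invented toy data.

```python
# Hypothetical sketch (the study's actual classifier is not described in
# this article): classify a smile as "female" when its mouth-expansion
# feature exceeds a threshold learned from labelled examples.

def fit_threshold(features, labels):
    """Pick the midpoint between the two class means as the threshold."""
    f = [x for x, y in zip(features, labels) if y == "female"]
    m = [x for x, y in zip(features, labels) if y == "male"]
    return (sum(f) / len(f) + sum(m) / len(m)) / 2.0

def classify(feature, threshold):
    """Label a smile by comparing its expansion feature to the threshold."""
    return "female" if feature > threshold else "male"

# Toy data: invented mouth-expansion values (pixels) for labelled smiles.
feats = [12.0, 11.0, 10.5, 6.0, 5.5, 7.0]
labels = ["female", "female", "female", "male", "male", "male"]
t = fit_threshold(feats, labels)
print(classify(9.0, t))  # → female
```

A real system would use many such dynamic features and a stronger classifier, which is presumably how the reported 86 per cent accuracy was reached and why the team believes it could be improved.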
"Because this system measures the underlying muscle movement of the face during a smile, we believe these dynamics will remain the same even if external physical features change, following surgery for example," Ugail said.
"This kind of facial recognition could become a next- generation biometric, as it's not dependent on one feature, but on a dynamic that's unique to an individual and would be very difficult to mimic or alter," he noted.
(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)