When you come home from work at night, does your robot greet you at the door expectantly, or does it sit there impassively in its recharging node because it can't tell the difference between you, the mailman, and Emilio Estevez? Today's computer scientists are hard at work making sure tomorrow's robots won't leave you feeling emotionally shunned. Check out how researchers at the Carnegie Mellon University Robotics Institute are using something called "Active Appearance Models" to improve the face recognition algorithms that will make your bot snap to attention when it sees your face.
Recognizing a face is harder than it sounds. Using Active Appearance Models (one of the most common methods in use today), a computer compares the face it sees to an "average face" it has previously learned from training images, along with the common ways real faces vary from that average. It works pretty well when the subject smiles and stares straight into the computer's camera, but in real life, lighting, facial expression, and "3D pose variation" present serious obstacles.
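For the curious, the "average face plus learned variation" idea can be sketched in a few lines of toy code. This is purely illustrative, not CMU's actual system: the landmark vectors, mode counts, and function names here are all made up for the example.

```python
import numpy as np

# Toy shape model: a face is a vector of landmark coordinates.
# The "average face" (mean shape) plus a few learned variation modes
# approximate any face the model knows -- the core idea behind
# Active Appearance Models.
rng = np.random.default_rng(0)

mean_face = rng.normal(size=10)                     # hypothetical mean landmark vector
modes = np.linalg.qr(rng.normal(size=(10, 3)))[0]   # 3 orthonormal variation modes

def fit_params(observed):
    """Project an observed face onto the model's variation modes."""
    return modes.T @ (observed - mean_face)

def reconstruction_error(observed):
    """How far the observed face is from anything the model can express."""
    reconstructed = mean_face + modes @ fit_params(observed)
    return np.linalg.norm(observed - reconstructed)

# A face generated by the model itself reconstructs almost perfectly...
known = mean_face + modes @ np.array([0.5, -1.0, 0.2])
print(reconstruction_error(known) < 1e-9)   # True

# ...while an arbitrary vector (the mailman, Emilio Estevez) does not.
stranger = rng.normal(size=10)
print(reconstruction_error(stranger) > 1e-6)  # True
```

The punchline: a low reconstruction error means "this looks like a face my model knows," which is exactly the comparison that breaks down when lighting or pose pushes the observed face outside the learned variation.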
The Robotics Institute team is working on that last bit. Whenever you turn your head, part of your face is occluded, and without the right features to make its comparison, the computer can't recognize you. The group's new fitting algorithms build 3D face meshes that can be adjusted on the fly to match the subject's face, even if she turns partly away from the camera.
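The "fit only what you can see" trick can also be sketched in toy form. Again, this is an assumption-laden illustration, not the Institute's algorithm: we simply solve for the model parameters using only the visible landmarks, ignoring the occluded ones.

```python
import numpy as np

# Same toy setup as before: a mean face plus a few variation modes.
rng = np.random.default_rng(1)
mean_face = rng.normal(size=10)
modes = np.linalg.qr(rng.normal(size=(10, 3)))[0]

def fit_visible(observed, visible):
    """Estimate model parameters from visible landmarks only.

    `visible` is a boolean mask; occluded rows are simply dropped,
    so a half-turned head can still be matched.
    """
    A = modes[visible]
    b = observed[visible] - mean_face[visible]
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params

true_params = np.array([1.0, -0.5, 0.25])
face = mean_face + modes @ true_params

# Hide 4 of the 10 landmarks (the side of the face turned from the camera).
visible = np.ones(10, dtype=bool)
visible[[2, 5, 7, 9]] = False

est = fit_visible(face, visible)
print(np.allclose(est, true_params))  # True -- the visible landmarks
                                      # still pin down all 3 modes
```

The design point: as long as the visible landmarks still constrain every variation mode, a least-squares fit recovers the same parameters you'd get from the whole face, which is why the mesh can keep tracking a partly turned head.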
Of course, the government will use this technology to track our every move long before we have friendly helper robots who know us on sight, but it's nice to know we live in a world where something called the Robotics Institute actually exists.

Image by: Robotics Institute.
AAM Fitting Algorithms. [Carnegie Mellon University Robotics Institute]