About
The present technology captures the direction the eyes are pointing while the subject is at a distance from the camera. This offers the potential for intuitive human-computer interfaces, allowing for greater interactivity, more intelligent behavior, and increased flexibility. A two-camera system is provided that detects the face with a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based on topographic features, and directs a second, active pan-tilt-zoom camera to zoom in on this eye region. Additionally, an eye gaze estimation approach is provided for point-of-regard (PoG) tracking on a large viewing screen. To allow greater head-pose freedom, a calibration procedure recovers the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are then mapped onto the eyeball sphere (forming a 3D iris disk) to obtain the optical axis; the fovea is rotated accordingly, and the final, visual-axis gaze direction is computed. This gaze estimation approach may be integrated into the two-camera system, permitting natural, non-intrusive, pose-invariant PoG estimation at a distance and allowing the user translational freedom, without resorting to infrared illumination or complex hardware setups such as stereo cameras or “smart rooms.”
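To make the geometry concrete, the following sketch (Python/NumPy) walks through the main steps under simplifying assumptions: the camera sits at the origin of the working coordinate frame, the eyeball center, eyeball radius, rest-pose optical axis, and fovea position are taken as outputs of the calibration step described above, and the screen is a known plane. All function and parameter names are illustrative assumptions, not the system's actual implementation; the face/eye detection, iris-contour fitting, and the pan-tilt-zoom control loop of the two-camera system are omitted.

# Illustrative sketch only (assumed names and conventions, not the system's code):
# back-project the detected iris center onto the calibrated eyeball sphere, form
# the optical axis, rotate the calibrated fovea with the eyeball, and intersect
# the resulting visual axis with the screen plane to obtain the point of regard.
import numpy as np

def back_project_to_eyeball(pixel, eyeball_center, eyeball_radius, camera_matrix):
    # Intersect the camera ray through an image point with the eyeball sphere,
    # assuming the camera sits at the origin of the eyeball coordinate frame.
    ray = np.linalg.inv(camera_matrix) @ np.array([pixel[0], pixel[1], 1.0])
    ray /= np.linalg.norm(ray)
    # Solve ||t*ray - center||^2 = radius^2 for t (ray-sphere intersection).
    b = -2.0 * ray @ eyeball_center
    c = eyeball_center @ eyeball_center - eyeball_radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # the ray misses the eyeball
    t = (-b - np.sqrt(disc)) / 2.0       # nearer root: the visible front surface
    return t * ray

def rotation_between(a, b):
    # Rotation matrix taking unit vector a onto unit vector b (Rodrigues formula).
    v = np.cross(a, b)
    c = float(a @ b)
    K = np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])
    if np.isclose(c, -1.0):              # opposite vectors: rotate 180 degrees
        axis = np.cross(a, np.eye(3)[np.argmin(np.abs(a))])
        axis /= np.linalg.norm(axis)
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        return np.eye(3) + 2.0 * K @ K
    return np.eye(3) + K + K @ K / (1.0 + c)

def estimate_pog(iris_center_px, eyeball_center, eyeball_radius,
                 fovea_rest, optical_axis_rest, camera_matrix,
                 screen_point, screen_normal):
    # 1. Optical axis: from the eyeball center through the iris center lifted
    #    onto the eyeball sphere (the iris contour points would be lifted the
    #    same way to form the 3D iris disk; omitted here for brevity).
    iris_3d = back_project_to_eyeball(iris_center_px, eyeball_center,
                                      eyeball_radius, camera_matrix)
    optical_axis = iris_3d - eyeball_center
    optical_axis /= np.linalg.norm(optical_axis)

    # 2. Rotate the calibrated (rest-pose) fovea by the same eyeball rotation.
    R = rotation_between(optical_axis_rest, optical_axis)
    fovea_3d = eyeball_center + R @ (fovea_rest - eyeball_center)

    # 3. Visual axis: from the rotated fovea through the 3D iris center.
    visual_axis = iris_3d - fovea_3d
    visual_axis /= np.linalg.norm(visual_axis)

    # 4. Point of regard: intersect the visual axis with the screen plane.
    t = ((screen_point - fovea_3d) @ screen_normal) / (visual_axis @ screen_normal)
    return fovea_3d + t * visual_axis

In the same spirit, lifting the full iris contour onto the sphere (the 3D iris disk) would allow the optical axis to be taken as the disk normal, which is less sensitive to noise in the detected iris center than the single-point version sketched here.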
Key Benefits
• Real-time, robust tracking using two orthogonal cameras, without any intrusive gloves or markers.
• Accurate pointing at cursor resolution.
• Intuitive drawing in 3D space.
• Feasible for finger pointing at a long distance.