In today's tech-driven world, where a child who cannot yet eat on his or her own can already use the apps on a smartphone, researchers are devising new techniques to ease human-mobile interaction. The latest trend in this research is the "EyePhone". The technology is so easy to use that a movement of your eye can switch applications on your mobile; it is essentially a hands-free tool, and even a wink can expose different functionalities of your smartphone. EyePhone works by tracking the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to:
1. Track the eye and infer its position on the mobile phone display as a user views a particular application.
2. Detect eye blinks that emulate mouse clicks to activate the target application under view.
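The two steps above can be sketched in a few lines. This is a hypothetical illustration only: the 3x3 icon grid, the screen dimensions, and the app names are assumptions, not part of the EyePhone system, and the gaze coordinates are assumed to come from the tracker described later.

```python
# Hypothetical sketch: map an estimated gaze position on the display to the
# app icon under view (step 1), and treat a blink as a "click" that
# activates that app (step 2). Grid, screen size, and names are assumptions.

APPS = [["Phone", "Mail", "Maps"],
        ["Music", "Camera", "Clock"],
        ["Notes", "Browser", "Settings"]]

def gaze_to_app(x, y, width=480, height=800):
    """Step 1: map a gaze point (x, y) to the app icon in a 3x3 grid."""
    col = min(int(x / (width / 3)), 2)
    row = min(int(y / (height / 3)), 2)
    return APPS[row][col]

def on_blink(x, y):
    """Step 2: a blink emulates a mouse click on the app under view."""
    return f"launch:{gaze_to_app(x, y)}"
```

A blink at the centre of the screen, for example, would activate whichever icon sits in the middle cell of the grid.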
A live example of EyePhone technology is the Nokia N810, which is capable of tracking the position of the eye on the display and mapping these positions to an application that is activated with a wink.
With the evolution of both software and hardware on mobile devices, users are becoming more demanding of user interfaces that provide both functionality and a pleasant user experience. The goal of mobile interaction researchers is to understand the requirements and needs of mobile users. Compared with stationary devices, mobile devices have specific, often restricted, input and output capabilities. One frequently stated goal is to overcome the limitations of mobile devices; however, exploiting the special opportunities of mobile use can also be seen as a central goal.
Human-Computer Interaction (HCI) researchers and phone vendors are continuously searching for new approaches to reduce the effort users exert when accessing applications on limited form factor devices such as mobile phones. The most significant innovation of the past few years is the adoption of touchscreen technology, introduced with the Apple iPhone and recently followed by all the other major vendors, such as Nokia and HTC. The touchscreen has changed the way people interact with their mobile phones because it provides an intuitive way to perform actions using the movement of one or more fingers on the display (e.g., pinching a photo to zoom in and out, or panning to move a map).
Even we humans sometimes misread each other's reactions, so how can a mobile phone be expected to understand eye movement? That is the challenge for researchers. Consider a situation where a person is running or walking and the camera image is blurred: how does the phone then read the user's eye to decide which action to perform? To make this task manageable, the work is divided into phases:
1. an eye detection phase;
2. an open eye template creation phase;
3. an eye tracking phase;
4. a blink detection phase.
In what follows, we discuss each of these phases in turn.

Eye Detection
This phase finds the contour of the eyes by applying a motion analysis technique that operates on consecutive frames; the eye pair is identified by the left and right eye contours. While the original algorithm identifies the eye pair with almost no error when running on a desktop computer with a fixed camera, the phone setting, where both the device and the user may be moving, is considerably harder. Some related work exploits accelerometers on the phone in order to infer gestures.
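The four phases can be sketched in outline as follows. This is a hypothetical, simplified illustration assuming grayscale frames held as NumPy arrays; the frame-differencing detector, the normalized-cross-correlation matcher, and all thresholds are illustrative assumptions, not the EyePhone implementation.

```python
import numpy as np

# Hypothetical sketch of the four phases on grayscale frames (NumPy arrays).
# Thresholds and the correlation-based matcher are illustrative assumptions.

def detect_eye_region(prev, curr, thresh=25):
    """Phase 1: motion analysis on consecutive frames -- return the
    bounding box (r0, r1, c0, c1) of the pixels that changed."""
    diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    if not diff.any():
        return None
    rows = np.where(diff.any(axis=1))[0]
    cols = np.where(diff.any(axis=0))[0]
    return rows[0], rows[-1] + 1, cols[0], cols[-1] + 1

def ncc(a, b):
    """Normalized cross-correlation between two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

class EyeTracker:
    def __init__(self, blink_threshold=0.6):
        self.template = None
        self.blink_threshold = blink_threshold

    def create_template(self, frame, box):
        """Phase 2: crop the detected open eye as the template."""
        r0, r1, c0, c1 = box
        self.template = frame[r0:r1, c0:c1].astype(float)

    def track(self, frame):
        """Phase 3: brute-force search for the patch best matching the
        open-eye template; returns ((row, col), correlation score)."""
        h, w = self.template.shape
        best, pos = -1.0, (0, 0)
        for r in range(frame.shape[0] - h + 1):
            for c in range(frame.shape[1] - w + 1):
                s = ncc(frame[r:r + h, c:c + w].astype(float), self.template)
                if s > best:
                    best, pos = s, (r, c)
        return pos, best

    def is_blink(self, score):
        """Phase 4: a sharp drop in template correlation (the open eye
        no longer matches the camera frame) is treated as a blink."""
        return score < self.blink_threshold
```

The brute-force search is written for clarity; a real implementation on a phone would restrict the search to a window around the last known eye position to keep the computation cheap.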
