JOURNAL OF COMPUTING, VOLUME 4, ISSUE 3, MARCH 2012, ISSN 2151-9617 https://sites.google.com/site/journalofcomputing WWW.JOURNALOFCOMPUTING.ORG
Eye Tracking System with Blink Detection
Sidra Naveed, Bushra Sikander, and Malik Sikander Hayat Khiyal

Abstract— This paper presents an efficient eye tracking system with an eye blink detection feature for controlling an interface, providing an alternate means of communication for people suffering from severe physical disabilities. The proposed system uses the pupil to track the movement of the eyes: it detects and locates the center of the pupil and then uses this information to move the mouse cursor accordingly, driving the interface. The system has shown successful results on recorded camera videos and accurately detects eye blinks, whether voluntary or involuntary. The system is totally non-intrusive, as it uses only a video camera for recording, so it is more user-friendly and easier to configure. The experimental results indicate that the proposed system performs better and more efficiently than current systems in practice.

Index Terms: Eye tracking, Pupil detection, Blink detection, Mouse movement.
—————————— ——————————
1. INTRODUCTION
In recent years, due to rapid advances in technology, there has been great demand for human computer interfaces (HCIs). Many systems have already been developed for people who can perform any action voluntarily, but systems were also needed for people who can perform very few voluntary actions. Often the only action that disabled people can perform voluntarily is blinking their eyes. Because of this demand, there has been increased development of HCI systems based on eye biometrics. The need for systems for physically disabled people has motivated many researchers to develop eye tracking systems that provide ease of use for such users. For these users, an eye tracking system is a substitute for their impaired physical abilities. Such systems allow interaction between humans and computers and can easily be used even by people who cannot speak or write.
Eye tracking systems use image processing techniques based on eye biometrics. In image processing, the input data is converted into digital form and various mathematical operations are applied to it to create an enhanced image and to perform tasks such as recognition or authentication on a digital computer. The process is also called picture processing. In image processing, the picture is analyzed to identify shades, colors, contrasts and shapes that cannot be perceived by the human eye [1].
————————————————
Sidra Naveed is an undergraduate student of the Department of Software Engineering, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan. Bushra Sikander is Lecturer at the Department of Computer Sciences, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan. Dr. Malik Sikandar Hayat Khiyal is Professor and Head of Academic (ES), APCOMS, Khadim Hussain Road, Lalkurti, Rawalpindi, Pakistan.

There are many systems and applications based on human eye tracking. Various kinds of human computer interfaces [2] exist that make use of human eye movements and eye blinking. Some interfaces use eye movement for controlling a mouse cursor, some systems track the eyes to check the drowsiness of a driver, some applications use the eyes for typing a web address, and the eyes are used in many vision-based interfaces. Many eye tracking techniques are also used in medicine [3] [4] and optometry to diagnose certain diseases.
Without considering the human vision system we cannot think of image processing: with our visual system we observe and evaluate the images that we process. The eye is the organ of vision and light perception. It is like a camera with an iris diaphragm and variable focusing; it is used for seeing and supports intellectual judgments about what is seen. It is a specialized light-sensitive structure that forms the images of sight. The human eye is a spheroid structure located at the front of the skull, resting in a bony cavity. It is spherical in shape with an average diameter of 20 mm and is surrounded by three membranes: the cornea and sclera, which form the outer layer; the choroid; and the retina, which encloses the
eye. The original captured eye image from a video is shown in figure 1.

Figure 1. Original eye image

The colored part of the eye is called the iris. It is circular in shape and varies in size. The iris regulates the amount of light entering the eye by adjusting the size of the pupil, and it can be of various colors such as green, brown, blue, hazel or grey. Like fingerprints, the shape, color and texture of each individual's iris are unique. The pupil is the opening through which light, and the images that we view, enter the eye. The iris forms the pupil, and as the iris changes, the size of the pupil increases or decreases correspondingly.
There are many systems and applications based on human eye tracking and blink detection. The ITU Gaze Tracker aims to provide a low-cost gaze tracker that is easily accessible. It was developed by the IT University of Copenhagen [5] and has been supported by COGAIN, the Communication by Gaze Interaction Association. Another eye tracking system is the ETD (Eye Tracking Device), a head-mounted device used to measure three-dimensional eye and head movements. The device was first designed by Prof. Dr. Andrew H. Clarke in cooperation with the companies Chronos Vision and Mtronix, and was originally developed for the German space agency (DLR) [6]. The device was developed for use on the International Space Station (ISS) and was brought to the station in early 2004.
Eye tracking systems are also used in vehicles to check the drowsiness and attention of the driver; when such a system detects that the driver is dozing, it generates a warning alarm. A camera associated with these systems is integrated into the automobile to assess the behavior of the driver. These systems are also used to check the speed of the vehicle, its distance from other vehicles, and whether the vehicle is staying in its driving lane. A camera is mounted in a
car, angled towards the driver's face, and the driver's blink patterns are recorded using the camera. From these blink patterns, the percentage of time spent engaged in driving can be measured. In 2006 Lexus equipped its LS 460 with its first driver monitoring system [7]; the system generates a warning if the driver takes his or her eyes off the road. The VisionTrak eye tracking system, developed by ISCAN, accurately tracks what a person is looking at; both desktop and head-mounted versions are available. The desktop version of VisionTrak [8], the binocular desktop 300, has many applications. The iMotions Attention Tool [9] is attention tracking software based on human eye tracking that combines eye tracking metrics with reading and emotion metrics; combining these metrics gives a unique picture of consumer behavior. Another low-cost eye tracking system is the openEyes toolkit [10], which includes algorithms that measure the movement of the eyes from digital videos. Eye tracking systems are also used in web usability: UserPlus [11] is a free usability-testing software program, a beta system that shares usability knowledge with designers, developers and usability specialists across the network. Eye tracking systems are also used in laser refractive surgery: the STAR S4 IR Excimer Laser System [12] is an advanced laser vision system that reduces the effect of the laser on the cornea, thus increasing the safety of the patient. Another application of eye tracking is in language reading, where eye tracking is used to investigate human language development, language skills and reading behavior; Tobii remote eye trackers are used for this purpose.
The Tobii TX300 eye tracker [13] is a remote eye tracker with very high accuracy that can tolerate large head movements, and it is used in studies of language reading with eye tracking methods.

2. LITERATURE REVIEW
The human eye is one of the most important and prominent features of the face and conveys much useful information besides facial expressions. By detecting the position of the eye, many useful interfaces can be developed. Considerable research has gone into developing intrusive and non-intrusive eye tracking systems. Intrusive eye tracking systems are those in which there is direct contact with the eyes; non-intrusive eye tracking systems are those in which there is no physical contact with the user.
Studies of different eye tracking techniques were conducted during the analysis phase of this research. The study showed many advantages as well as disadvantages of the different techniques. The commonly used techniques are limbus tracking, iris tracking and electrooculography.

In iris tracking, the motion and direction of the iris are detected to design and implement an eye tracking system for a human computer interface (Shazia et al.) [14]. In this technique, batch mode is used for iris detection. The system allowed users to interact with the computer by using their eye movements. The system accurately located and detected the eyes in images with different iris positions and used this information to move the mouse cursor accordingly. As this iris tracking method was evaluated on static images, it provided a high degree of accuracy. The developed system is restricted to work only when the direction of the iris is left, right or center; it does not work when the position of the iris is up or down. The system was not extended to work in real time and is not able to handle blinks and closed eyes.

Another technique that was analyzed for tracking and detecting eye blinking is the statistical Active Appearance Model (AAM) (Ioana Bacivarov et al.) [15]. The approach offers a 2D model that quickly matches the texture and shape of the face. Using an Active Appearance Model, a proof-of-concept model of the eye region is created to determine the parameters that measure the degree of eye blink. An initial model that employs two independent eye regions is then extended using component-based techniques. After developing the eye model, a blink detector is proposed. The main advantage of the AAM technique is that a detailed description of the eye is obtained, not just its rough location. The system is able to synthesize all the states in between, facilitating the extraction of the blink parameters. The main drawbacks of the AAM technique are that it is designed to work for a single individual and that the blink parameters have to be identified in advance. For large variations of pose, in-plane rotation etc., the conventional statistical model performs poorly.

Eyes can also be tracked using two interactive particle filters, one for the open eye and the other for the closed eye (Junwen Wu et al.) [16]. The initial eye position is located using an eye detection algorithm, and then the filters are used for eye tracking and blink detection. Autoregressive models are used to describe the state transition. A classification-based model is used for measuring observations, and a regression model is used in the tensor subspace to give the posterior estimation. Performance is measured in two aspects: blink detection rate and tracking accuracy. Videos from varying scenarios are used to evaluate the blink detection rate, whereas tracking accuracy is measured using benchmark data collected with a Vicon motion capture system. Particle filters have the advantage that with sufficient samples the solution approaches the Bayesian estimate. The proposed algorithm is able to accurately track eye locations and detect both voluntary long blinks and involuntary short blinks. Normalizing the size of the images is crucial in the case of the subspace-based observation model, and a bad scale transition model can severely affect the performance.

The user's eyes can also be located in a video sequence by detecting the eye blink motion (Kristen Grauman et al.) [17]. An initial eye blink is used to locate the eyes. The algorithm detects eye blinks, measures their duration, and then uses this information to control a non-intrusive interface; the "BlinkLink" prototype invokes the system. The eyes are located by considering the motion information between two consecutive frames and determining whether this motion is caused by a blink; the eye is then tracked and monitored constantly. This is a real-time system that consistently runs at 27-29 frames per second. It is completely non-intrusive and does not require manual initialization or special lighting. Voluntary and involuntary blinks can be classified reliably by the system. Its disadvantage is that it can only handle long blinks; short blinks are simply ignored.
A vision-based system (Aleksandra Krolak et al.) [18] has also been used to detect long voluntary eye blinks and interpret these blink patterns for communication between man and machine. In the presented model, a statistical approach based on Haar-like masks is used: templates of different size and orientation are convolved with the image to compute Haar-like features. Face detection is done by sliding a search window of the same size as the face images used for training through the test image. In the next step the eyes are localized, and the extracted image of the eye region is further processed for eye blink detection. The detected eyes are tracked using the normalized cross-correlation method. The main advantage of this system is that it requires neither prior knowledge of face location or skin color nor special lighting.

Study of all the above systems has shown that each has both advantages and disadvantages. These systems were developed to work on static images and were not extended to work in real time. Some of them can handle only voluntary long blinks and cannot cater for short blinks. Some were designed to work for a single individual, with the blink parameters identified in advance, and they perform poorly for large pose variations. The goal of this research is to develop a system that can be extended to work in real time and that achieves accurate tracking and precise blink detection based on pupil tracking. The proposed system can detect both spontaneous and voluntary eye blinks and requires neither prior knowledge of face location or skin color nor special lighting.

With the growth of technology, many systems have been developed for people with normal physical abilities, but these systems cannot be used by the handicapped. The need for systems for people with severe physical disabilities has therefore grown, to provide these people an alternate way of communicating with computers. To fulfill this demand, the normal eye tracking system has been extended to work in real time. This research proposes a system that achieves accurate tracking and precise blink detection and is also able to handle eye blinks. The information from the human eye movement is used to drive the interface.

3. SYSTEM DESIGN
This section presents the mathematical model and design of the developed system, showing the interrelationships of each phase and the relationships among the different variables of the system.

3.1. Statement of the modeling problem
By following the above mentioned steps, the resultant solution is as follows:

Resultant_Image = MM(BD(CC(PS(F(D(E(S(B(U(G(R(I))))))))))))

The variables used in the above formula are described in table 1.

Table 1. List of factors

Symbol  Description
I       Iris portion
R       Read image
U       Uint8 conversion
G       Gray scale conversion
B       Binary image
S       Smoothing filter
E       Edge detection
D       Dilation process
F       Fill holes
PS      Pupil segmentation
CC      Center point comparison
BD      Blink detection
MM      Mouse movement
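This composed operator can be mirrored directly in code. Below is a minimal sketch in Python (the paper's implementation is in MATLAB); the stage functions are placeholders standing in for the processing steps of Table 1, not the authors' actual routines:

```python
from functools import reduce

def compose(*stages):
    """Compose unary functions right-to-left, so compose(MM, ..., R)(I)
    evaluates as MM(...(R(I))...), matching the formula above."""
    return reduce(lambda f, g: lambda x: f(g(x)), stages)

def make_stage(name):
    # Placeholder stage: records its symbol instead of transforming pixels.
    return lambda trace: trace + [name]

# Outermost-to-innermost order, exactly as written in the formula.
symbols = ["MM", "BD", "CC", "PS", "F", "D", "E", "S", "B", "U", "G", "R"]
resultant = compose(*[make_stage(s) for s in symbols])

# Applying the pipeline to an "image" (here an empty trace) shows the
# stages firing innermost-first: read, grayscale, uint8, binary, ...
print(resultant([]))
```

With real image-processing functions substituted for the placeholders, the same `compose` call would produce the resultant image in one expression.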
3.2. System design
In the first step of the system design, frames are acquired from the recorded videos and stored in a database, from where they are retrieved in batch mode for further processing. The acquired images are sharpened, by applying image enhancement techniques, to enhance the image details and remove unnecessary detail. After enhancing the image, the edges of the area of interest (in this case the iris and pupil) are detected, to extract the pupil portion from the enhanced image. After the pupil portion is segmented, eye blinks are detected. The center point of the pupil is calculated in each frame and the mouse is triggered corresponding to these points. Each step is explained below.

A. Image acquisition
The first step in the proposed eye tracking system is acquiring the images. The frames are captured from a recorded video and stored on a permanent storage device, from where they are retrieved one by one for further processing and converted into grayscale format. Since preprocessing is easier on binary images, and the pupil portion is more prominent in binary than in grayscale, the images are then converted into binary. The binary conversion yields images in which the pupil portion is shown in black and the rest of the image is white, highlighting the area of interest (the pupil) and making the task of eye tracking easier. The acquired original RGB image is shown in figure 2.

Figure 2. Original RGB image

B. Smoothing process
In image processing, smoothing is the process of capturing the important patterns in an image. So after acquiring the images, in the second step the images are smoothed to bring out the desired patterns. An average filter is applied to the images for smoothing; it smooths the data by eliminating noise, performing spatial filtering on each individual pixel using the surrounding gray-level values. The requirement of having less detail in the image (a blurred image) is achieved by eliminating the extra detail and noise with the average filter. Increasing the size of the filter mask makes the image more blurred; this helped in detecting prominent edges of the area of interest, as more noise is removed. Several other filters were also applied to the images, but the average filter showed the most efficient results on the binary images for the proposed system, as shown in figure 3.

Figure 3. Smoothed image

C. Edge detection
After smoothing the images, the next step is to detect the edges of the area of interest (in this case the pupil portion) in order to extract the pupil from the enhanced images. The Canny filter is applied to the images to bring out the edges; it first smooths the edges and then highlights the sharp changes in image brightness to extract the important information from the image. Several other filters were applied to sharpen the edges, but for the proposed system the Canny filter produced the desired results, presented in figure 4.
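The averaging step described above can be sketched in a few lines. This is an illustrative pure-Python box (mean) filter on a grayscale image stored as a list of rows, not the authors' MATLAB code; border pixels simply average over whatever neighbours fall inside the image:

```python
def average_filter(img, k=3):
    """Smooth a grayscale image with a k x k mean mask; a larger k
    blurs more, as noted in the text."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - r), min(h, y + r + 1))
                    for i in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = sum(vals) // len(vals)  # mean of the neighbourhood
    return out

# A single bright (noisy) pixel is spread over its 3x3 neighbourhood.
noisy = [[0] * 5 for _ in range(5)]
noisy[2][2] = 90
smooth = average_filter(noisy)
print(smooth[2][2])  # 10: the 90 averaged over nine pixels
```

In MATLAB this step would more likely be done with a built-in averaging kernel; the loop form just makes the per-pixel spatial averaging explicit.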
Figure 4. Detection of edges

D. Pupil segmentation
After obtaining a clear edge of the desired region, i.e. the pupil, the movement of the eye can be tracked; the pupil portion is used for tracking. A dilation operation is performed on the images to make the pupil portion more prominent. After applying this morphological operation, illustrated in figure 5, the only hole highlighted in the image is the hole of the pupil.

Figure 5. Dilated image

This pupil hole is filled with white to make the pupil prominent, as shown in figure 6. After filling the holes, the resulting images show the pupil portion in white and the rest of the image in black.

Figure 6. Filled pupil

After identifying the pupil portion, the next step is to segment it from the rest of the image. This is achieved by detecting the center points of the pupil in the images. The only region containing the maximum number of white pixels is the pupil, so finding the column with the maximum number of white pixels and calculating its center point gives the center point of the pupil, as shown in figure 7.

Figure 7. Pupil Center point

The radius of the pupil is then calculated, and by computing the coordinates from the starting and ending values of the column with the maximum number of white pixels together with the pupil radius, the pupil portion is segmented, as shown in figure 8.

Figure 8. Segmented pupil
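The center-point rule above (find the column with the most white pixels, then take the midpoint of its white run) can be sketched as follows. This is an illustrative reconstruction, assuming a binary image stored as a list of rows with 255 for white:

```python
def pupil_center(binary):
    """Return (row, col) of the pupil center: the midpoint of the white
    run in the column with the most white (255) pixels, or None when the
    image has no white pixels (closed eye)."""
    h, w = len(binary), len(binary[0])
    best_col, best_count = None, 0
    for x in range(w):
        count = sum(1 for y in range(h) if binary[y][x] == 255)
        if count > best_count:
            best_col, best_count = x, count
    if best_col is None:
        return None                      # no pupil visible
    rows = [y for y in range(h) if binary[y][best_col] == 255]
    return ((rows[0] + rows[-1]) // 2, best_col)

# Toy filled pupil: column 2 is white over all 5 rows, its neighbours
# over 3 rows, so column 2 wins and its midpoint is row 2.
img = [[0] * 5 for _ in range(5)]
for y in range(5):
    img[y][2] = 255
for y in (1, 2, 3):
    img[y][1] = img[y][3] = 255
print(pupil_center(img))  # (2, 2)
```

The `None` case is what the blink-detection step later exploits: a closed eye yields no center point.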
E. Center point comparison for shifting
The center point of the pupil lies in the column with the maximum number of white pixels. After the center point of each image is calculated, these points are stored in a 2-D array with their x and y coordinates. After obtaining the center point of the pupil in each image, the next task is to compare these points in
order to determine the shifting of the pupil from one image to the next. This shifting value is used to control the mouse movement accordingly. If the difference between two center points is negative, the mouse is moved in the right direction to the specific location; if the difference is positive, the mouse is moved in the left direction to the specified location. If no center point is detected, the cursor remains at the previously specified location.

F. Blink detection
To detect an eye blink, the first step is to look for the center point of the pupil. If the eye is open, the center point of the pupil is easily calculated, as there will be a maximum number of white pupil pixels somewhere in the image. If the eye is closed, there are no white pupil pixels in the image and thus no center point is calculated. A count variable is created: if the eye is open and a center point is calculated, the count variable is '1'; if the eye is closed, the count variable is '0'. If the count value is 1 in one image and 0 in the next, or vice versa, there has been an eye blink. So by comparing the values of the count variable, eye blinks are detected.

G. Mouse movement
The stored center points of the pupil are used to trigger mouse movements. The points are passed to the mouse function one by one, resulting in movement of the mouse between these points. If the eye is closed, no center point is calculated, so no value is passed to the mouse function and the mouse cursor remains where it was.

4. EXPERIMENTAL RESULTS
The system has been tested for accuracy and efficiency on many recorded videos. The results showed a system accuracy of approximately 90%. It was noticed that, for accurate eye tracking, the center point of the pupil must be detected correctly. MATLAB R2009a (version 7.8.0) was used for the design and implementation of the eye tracking system. MATLAB was chosen because it has powerful graphics and is easy to use; programs can be interpreted easily and debugging is straightforward. MATLAB has a large number of additional applications for analysis, data acquisition, image and signal processing, finance, control design and simulation, and it is widely used for teaching basic mathematical and engineering concepts and for simple manipulations with matrices. The system was tested on sample images from different videos with different eye directions.
The original RGB image captured from a recorded video is shown below in figure 9.

Figure 9. Original RGB image

A binary operation is applied to the original sample image to make the computation easy. The resulting binary image is given in figure 10.

Figure 10. Binary image
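The count-variable comparison of section F and the shifting rule of sections E and G can be sketched together. This is an illustrative reconstruction, not the authors' MATLAB code; it assumes the "difference" between center points means previous minus current:

```python
def count_blinks(counts):
    """counts[i] is 1 when a pupil center was found in frame i, else 0;
    every 1->0 or 0->1 transition between consecutive frames is a blink."""
    return sum(1 for a, b in zip(counts, counts[1:]) if a != b)

def cursor_directions(centers):
    """centers[i] is the pupil x-coordinate in frame i, or None when the
    eye is closed. Negative difference -> move right, positive -> left,
    no center -> the cursor stays where it was."""
    prev = None
    moves = []
    for c in centers:
        if c is None or prev is None:
            moves.append("stay")       # no detected center: hold position
        elif prev - c < 0:
            moves.append("right")      # negative difference
        elif prev - c > 0:
            moves.append("left")       # positive difference
        else:
            moves.append("stay")
        if c is not None:
            prev = c
    return moves

print(count_blinks([1, 1, 0, 1]))            # 2: one close, one reopen
print(cursor_directions([10, 14, None, 9]))  # ['stay', 'right', 'stay', 'left']
```

Note that a `None` frame both registers toward a blink (via the count sequence) and leaves the cursor in place, matching the behavior described in sections F and G.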
The Canny filter is applied to the binary image in figure 10 to get clear and prominent edges, as it highlights the important information in the images. The resulting image with prominent edges is shown in figure 11.
Figure 11. Edge detected image

After obtaining prominent edges, the next step is to highlight the edges to make them more prominent and visible. Image dilation, a morphological operation, is performed to make the edges thicker and more prominent, as given in figure 12.

Figure 12. Image after morphological operation

In the dilated image, the hole of the pupil is filled with white to make it clearer. After filling the hole of the pupil, the next task is to calculate the center point of the pupil for eye tracking. The center point is calculated by finding the middle value of the column with the maximum number of consecutive white pixels. The image with the marked center point of the pupil is shown in figure 13.

Figure 13. Center point of pupil

After detecting the center point, the radius of the pupil is calculated as the distance from the start of the column with the maximum number of white pixels to the center point of the pupil. This radius is then used to draw a pupil circle on the original grayscale images. The radius of the pupil is given by the following equation:

Radius = center point - starting point;

Using this radius, the number of circle points to plot is calculated:

re2 = round(2 * pi * rad);

where 'rad' is the radius of the pupil. Then the circle is traced:

for loop = 0:step2:2*pi

At each step, xx2 and yy2 are calculated and, used as indices into the image, draw a circle of radius 'rad' on the grayscale image. The midpoint is mapped onto the image using the following steps:

Image(Center point-1, Column) = 255;
Image(Center point, Column-1) = 255;
Image(Center point, Column) = 255;
Image(Center point, Column+1) = 255;
Image(Center point+1, Column) = 255;

where Image is the original grayscale image on which the center point is to be mapped and 'Column' is the column with the maximum number of consecutive white pixels. The grayscale image with the pupil circle and the pupil midpoint mapped is shown in figure 14.
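The circle-drawing procedure above can be mirrored outside MATLAB. The sketch below keeps the paper's names (`rad`, `re2`, `step2`, `xx2`, `yy2`) but is an illustrative reconstruction: the angular step is chosen so that about `round(2*pi*rad)` points, one per unit of circumference, are plotted:

```python
import math

def draw_pupil_circle(img, center_row, center_col, rad):
    """Mark a circle of radius rad and a small cross at the center on a
    grayscale image (list of rows), as in the mapping steps above."""
    re2 = round(2 * math.pi * rad)   # number of circle points to plot
    step2 = 2 * math.pi / re2        # angular step between points
    h, w = len(img), len(img[0])
    for k in range(re2):
        t = k * step2
        xx2 = center_row + round(rad * math.sin(t))
        yy2 = center_col + round(rad * math.cos(t))
        if 0 <= xx2 < h and 0 <= yy2 < w:
            img[xx2][yy2] = 255      # white pixel on the circle
    # map the midpoint: the 5-pixel cross from the listed assignments
    for dr, dc in ((-1, 0), (0, -1), (0, 0), (0, 1), (1, 0)):
        if 0 <= center_row + dr < h and 0 <= center_col + dc < w:
            img[center_row + dr][center_col + dc] = 255

img = [[0] * 21 for _ in range(21)]
draw_pupil_circle(img, 10, 10, 6)
print(img[10][16], img[10][10])  # 255 255: a circle point and the center
```

Rounding the parametric coordinates to integer pixel indices is what keeps every plotted point on the image grid, as in the MATLAB original.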
Figure 14. Center detected image

After obtaining the pupil portion with its center point, the next step is to segment this portion from the rest of the image, as shown in figure 15. Segmentation is done by calculating the four coordinates of the pupil portion to be segmented, as follows:

r1 = fstart1;
r2 = fend1;
r3 = Column - rad;
r4 = Column + rad;
pupil = imgg(r1:r2, r3:r4);

where 'fstart1' is the starting point and 'fend1' the ending point of the column with the maximum number of white pixels.

Figure 15. Image of segmented pupil

The same steps are applied to other sample images from different recorded videos to get the pupil portion for eye tracking for controlling an interface. An original RGB image captured from another recorded video is shown in figure 16.

Figure 16. Original RGB image

After applying the binary operation to figure 16, the resulting image is shown in figure 17.

Figure 17. Binary image

On the binary image shown in figure 17, edges are detected, as shown in figure 18.

Figure 18. Edge detected image

The filled pupil with its marked center point is shown in figure 19.
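The four-coordinate crop above translates almost directly. The sketch below keeps the paper's names (`imgg`, `fstart1`, `fend1`, `Column`, `rad`), with one caveat: MATLAB ranges are inclusive while Python slices exclude the end, hence the `+ 1`:

```python
def segment_pupil(imgg, fstart1, fend1, column, rad):
    """Crop the pupil bounding box: rows span the white run of the
    max-white column, columns span one radius either side of it."""
    r1, r2 = fstart1, fend1              # first and last white row
    r3, r4 = column - rad, column + rad  # horizontal extent
    return [row[r3:r4 + 1] for row in imgg[r1:r2 + 1]]

# Toy 10x10 grayscale image; crop rows 2..6 and columns 3..7.
imgg = [[y * 10 + x for x in range(10)] for y in range(10)]
pupil = segment_pupil(imgg, 2, 6, 5, 2)
print(len(pupil), len(pupil[0]))  # 5 5
print(pupil[0][0])                # 23: pixel at row 2, column 3
```

The crop is therefore (fend1 - fstart1 + 1) rows tall and (2 * rad + 1) columns wide, centered horizontally on the max-white column.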
Figure 19. Center point of pupil

The center point of the pupil marked on the grayscale image is shown in figure 20.

Figure 20. Center detected image

The segmented pupil portion is shown in figure 21.

Figure 21. Segmented pupil

5. CONCLUSION AND FUTURE WORK
This research provides a system that is able to trigger mouse movements for controlling an interface, for people who suffer from severe physical disabilities and cannot operate a system with their hands. The system is able to track eye movements efficiently and accurately using the pupil portion and can accurately detect eye blinks, whether voluntary or involuntary. The system can track the eye portion with 90% detection accuracy. The system has been extended to work in real time using recorded videos. The proposed system is purely non-intrusive, as no hardware device is attached to the human body, so the system is user-friendly and easy to configure. There are still some aspects of the system that are under experimental conditions and development, but the project proved to be an overall success and achieved the goals and requirements proposed in the system specifications.
Many aspects of the system can be part of future work towards a more efficient and robust eye tracking system. The system can be shifted from recorded videos to a live webcam video, with some modifications, to make it a live system. It can be developed further so that it also detects human eye gaze and acts accordingly. A mouse action could be triggered when a blink is detected, and the system's efficiency can be improved to make it a more efficient dynamic system.

REFERENCES
[1] http://en.wikipedia.org/wiki/Image_processing, 19th June, 2011.
[2] Alex Poole, Linden J. Ball, "Eye tracking in human computer interaction and usability research: current status and future prospects", Encyclopedia of Human Computer Interaction, 2006, pp. 211-219.
[3] http://eyetrackingupdate.com/2011/06/27/eye-tracking-astigmatism-correction/, 28th June, 2011.
[4] Filippo Galgani, Yiwen Sun, Pier Luca Lanzi, Jason Leigh, "Automatic analysis of eye tracking data for medical diagnosis", in Proceedings of CIDM 2009, pp. 195-202.
[5] http://www.gazegroup.org/downloads/23-gazetracker, 8th May, 2011.
[6] http://en.wikipedia.org/wiki/Eye_Tracking_Device, 8th May, 2011.
[7] http://en.wikipedia.org/wiki/Driver_Monitoring_System, 8th May, 2011.
[8] http://www.polhemus.com/?page=Eye_VisionTrak, 8th May, 2011.
[9] http://www.objectivetechnology.com/market-research/software/imotions-attention-tool, 8th May, 2011.
[10] http://thirtysixthspan.com/openEyes/, 8th May, 2011.
[11] http://eyetrackingupdate.com/2010/06/30/eye-tracking-free-web-usability-tools/, 8th May, 2011.
[12] http://www.amo-inc.com/products/refractive/ilasik/star-s4-ir-excimer-laser
[13] http://www.tobii.com/analysis-and-research/global/products/hardware/tobii-tx300-eye-tracker/
[14] Shazia Azam, Aihab Khan, M.S.H. Khiyal, "Design and implementation of human computer interface tracking system based on multiple eye features", JATIT - Journal of Theoretical and Applied Information Technology, Vol. 9, No. 2, Nov. 2009.
[15] Ioana Bacivarov, Mircea Ionita, Peter Corcoran, "Statistical models of appearance for eye tracking and eye blink detection and measurement", IEEE Transactions on Consumer Electronics, Vol. 54, No. 3, pp. 1312-1320, August 2008.
[16] Junwen Wu, Mohan M. Trivedi, "Simultaneous eye tracking and blink detection with interactive particle filters", EURASIP Journal on Advances in Signal Processing, Vol. 2008, 17 pages, October 2007.
[17] Kristen Grauman, Margrit Betke, James Gips, Gary R. Bradski, "Communication via eye blinks - detection and duration analysis in real time", Proceedings IEEE Conf. on Computer Vision and Pattern Recognition, Lihue, HI, Vol. 1, pp. 1010, 2001.
[18] Aleksandra Krolak, Pawel Strumillo, "Vision-based eye blink monitoring system for human-computer interfacing", Advances in Human System Interactions Conference, pp. 994-998, May 25-27, 2008.
BIOGRAPHY
Sidra Naveed is an undergraduate student of the Department of Software Engineering, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan.
Bushra Sikander is Lecturer in the Department of Computer Science, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan. Her qualification is MS-CS.
Dr. Malik Sikandar Hayat Khiyal is Head of Academic (ES), APCOMS, Khadim Hussain Road, Lalkurti, Pakistan. He served in the Pakistan Atomic Energy Commission for 25 years and was involved in various research and development programs of the PAEC. He developed software for underground flow and advanced fluid dynamics techniques. He was also involved in teaching at the Computer Training Centre, PAEC, and at International Islamic University. His areas of interest are numerical analysis of algorithms, theory of automata and theory of computation. He has more than a hundred research publications in national and international journals and conference proceedings. He has supervised three PhD students and more than one hundred and thirty research projects at graduate and postgraduate level. He is a member of SIAM, ACM, the Informing Science Institute and IACSIT. He is associate editor of IJCTE, co-editor of the journals JATIT and International Journal of Reviews in Computing, and a reviewer for the journals IJCSIT, JIISIT, IJCEE and CEE (Elsevier).