
Automatic Electrooculogram Classification for Microcontroller Based Interface Design

M. Trikha, Student Member, IEEE, A. Bhandari, Student Member, IEEE, and T. Gandhi

Abstract—In this paper, we present a simple and novel technique for the classification of multi-channel electrooculogram (EOG) signals. In particular, a viable real-time EOG signal classifier for microcontrollers is proposed. The classifier is based on Deterministic Finite Automata (DFA). The system is capable of classifying sixteen different EOG signals. The viability of the system was tested through online experiments with able-bodied subjects.


I. INTRODUCTION

Classification of electrooculogram (EOG) signals is a major concern for the future development of multimode controllers, communication devices, and human-computer interfaces [1]-[7]. This area of science and technology has a direct impact on the social well-being of people with severe disabilities and paralysis. In particular, people suffering from locked-in syndrome [8] (also known as cerebromedullospinal disconnection) retain control only over their eye movements. Since this is the most prominent voluntary muscle activity left in their body, the electrooculogram becomes a natural candidate signal for the development of such devices. In the past few years, the focus on the development of assistive devices for people with severe disability has increased, largely by improving traditional systems. The video-oculography (VOG) and infrared-oculography (IROG) systems, which detect eye position from the cornea or iris, are among the developments made so far [9]. A number of possible methods of measuring eye movements, together with their advantages and disadvantages, have been reviewed in the literature [10]. The physical effort required to move the eyes is minimal compared with other gestures such as nodding the head, speaking, or writing, which further motivates the use of EOG signals for device control. Lately, there has been much effort to develop EOG-based assistive devices [11], [12].

Manuscript received February 3, 2007. Mrinal Trikha is a student with the Department of Electronics and Communication, Jaypee Institute of Information Technology, A-10, Sector 62, Noida 201 307, Uttar Pradesh, India (phone: 91-11-22241707; mobile: 91-9871104559; IEEE Student Member #80682655; e-mail: [email protected]; webpage: http://mrinaltrikha.info/). Ayush Bhandari is a student with the Department of Electronics and Communication, Jaypee Institute of Information Technology, A-10, Sector 62, Noida 201 307, Uttar Pradesh, India (mobile: 91-9818895300; e-mail: [email protected]). Tapan Gandhi is a Research Associate with the Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA, USA (mobile: 91-9911415834; e-mail: [email protected]).

In a companion paper [13], we introduced a methodology for estimating the maximum number of instructions that a microcontroller can process when an EOG signal is the input. In this paper, we present a novel scheme in which the EOG signal acquired from single/multiple horizontal and vertical channels is preprocessed, de-noised, and continuously fed to a comparator network for the extraction of key symbols. These symbols (explained later) are fed into a Deterministic Finite Automata (DFA) for the extraction of another set of symbols (also explained later), which are in turn fed into a second DFA that classifies the EOG signal. A major advantage of this DFA-based scheme is that it lends itself to easy development of embedded (and hardwired) systems for EOG-based multimode controllers.

II. EOG SIGNAL ACQUISITION

A resting potential is generated by an electric dipole formed by a positive cornea and a negative retina [14]. Electrooculography is a technique for measuring this resting potential. The resulting signal, called the EOG, is essentially a record of the difference in electrical potential between the front and back of the eye; it is correlated with eyeball movement and is obtained from electrodes placed on the skin near the eye. We used five silicon-rubber electrodes of impedance below 10 kΩ, placed around the eye, to obtain the EOG signals. Because of their low impedance, in the range of roughly 40-200 Ω, silicon-rubber conducting electrodes are better suited to sensing very low amplitude bio-signals than other electrode types such as Ag-AgCl electrodes. Horizontal EOG is measured as a voltage by means of electrodes placed as close as possible to the outer canthus of each eye. Similarly, vertical EOG is measured as a voltage by means of electrodes placed just above and below the eye, as shown in Fig. 1. The reference electrode is placed on the forehead.


Fig. 1. Electrode Placement for the present study
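As a minimal illustration of how the two channel voltages used in the rest of the paper are formed from this electrode arrangement, the following C sketch subtracts the paired electrode potentials and applies a fixed amplifier gain; the field names, gain value, and sample values are assumptions for illustration, not measurements or hardware details from this study.

/* Sketch only: forming the horizontal and vertical EOG channel voltages
 * from the five-electrode configuration of Fig. 1.  Field names, gain and
 * the sample values are illustrative assumptions. */
#include <stdio.h>

#define EOG_GAIN 1000.0   /* assumed differential amplifier gain */

/* raw electrode potentials in volts, referenced to the forehead electrode */
typedef struct {
    double h_pos, h_neg;  /* outer canthus electrodes          */
    double v_pos, v_neg;  /* above- and below-eye electrodes   */
} ElectrodeSample;

/* horizontal channel: difference between the two canthus electrodes */
static double horizontal_voltage(const ElectrodeSample *s) {
    return EOG_GAIN * (s->h_pos - s->h_neg);
}

/* vertical channel: difference between the above- and below-eye electrodes */
static double vertical_voltage(const ElectrodeSample *s) {
    return EOG_GAIN * (s->v_pos - s->v_neg);
}

int main(void) {
    ElectrodeSample s = { 250e-6, 50e-6, 120e-6, 110e-6 }; /* made-up values */
    printf("VH = %.3f V, VV = %.3f V\n",
           horizontal_voltage(&s), vertical_voltage(&s));
    return 0;
}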

With the eye at rest, the electrodes are effectively at the same potential and no voltage is recorded. Rotation of the eye to the right results in a potential difference, with the electrode in the direction of movement becoming positive with respect to the second electrode. The opposite effect results from rotation to the left, as illustrated in Fig. 2.

Fig. 2. Signal corresponding to angle of eye rotation

The EOG signal obtained from horizontal (left-right) eye movement is shown in Fig. 3. Visual analysis of the EOG signal confirms the occurrence of well defined peaks. These peaks are very clear and can be detected using thresholding.


Fig. 3. EOG signal obtained from Horizontal eye movement

Fig. 4. FFT of raw EOG signal obtained from horizontal eye movement

We analyzed the EOG signals using LabVIEW version 7.0. The power spectral density (PSD) computed for the data shown in Fig. 3 shows that the prominent frequency components extend up to about 40 Hz, with most of the signal energy concentrated around 4 Hz. Fig. 4 depicts the Fast Fourier Transform (FFT) of the raw signal.

III. EOG CLASSIFICATION PROCESS FLOW

The EOG signal acquired from the horizontal and vertical channels is preprocessed and de-noised to remove all unwanted frequency components. The de-noised signal is then fed to a comparator and NOR-gate network for the extraction of key symbols (here, key symbols refer to the digital outputs of the comparator and NOR-gate network), mapped onto the set C = {CV+, CV−, CVo, CH+, CH−, CHo}. The symbols in this set correspond to the different voltage levels in the EOG signal shown in Fig. 5. Here VH and VV are the signal amplitudes in the horizontal and vertical channels respectively, and THH, THL, TVH, and TVL are the high/low threshold levels of the horizontal and vertical channel EOG signals respectively. We set the threshold levels to +/-37.5% of the highest positive/negative amplitude of the acquired EOG signal; this value was determined statistically during online experiments using LabVIEW version 7.0, and may vary with the choice of electrodes and with other factors such as sweat on the surface of the skin. The symbols {CV+, CV−, CH+, CH−} mark the regions above the positive threshold and below the negative threshold in the vertical and horizontal channels respectively, while the symbols {CVo, CHo} mark the rest-zone areas in the vertical and horizontal channels respectively. These signals are fed as the primary input to the Peak Detection Deterministic Finite Automata (PDDFA), which, based on them, identifies positive/negative peaks and rest zones in the EOG signals obtained from the horizontal and vertical channels. The block diagram of this process is shown in Fig. 6.

Fig. 5 (a). Voltage Ranges and Levels in EOG Signal (Horizontal Channel)
Fig. 5 (b). Voltage Ranges and Levels in EOG Signal (Vertical Channel)

The output of the PDDFA is mapped onto the set P = {V+, V−, H+, H−} (explained under the heading 'PDDFA') and is used as input by the Movement Classifier Deterministic Finite Automata (MCDFA) to determine which of the sixteen supported eye movements was performed by the user. This result is then used by the O/P module (shown in Fig. 6) to indicate the detected eye movement on the LCD display.

Fig. 6. Block Diagram for the EOG Classification Process Flow for a microcontroller device (H. and V. channels → Pre-processing → De-noising → Comparators and NOR gates → PDDFA → MCDFA → O/P → LCD, with the DFA stages running on the µ-controller)
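The block diagram of Fig. 6 can be read as a single polling loop running on the microcontroller. The following C sketch shows that reading only; every function is a placeholder stub standing in for one block of the diagram (all names are assumptions, not the authors' firmware), and the 0.375 comparator threshold assumes a signal normalized to ±1.

/* Sketch of the Fig. 6 process flow as a polling loop.  All functions are
 * placeholder stubs for the blocks of the diagram. */
#include <stdio.h>

typedef enum { CV_POS, CV_NEG, CV_O, CH_POS, CH_NEG, CH_O } CSymbol;  /* set C */
typedef enum { P_NONE, V_POS, V_NEG, H_POS, H_NEG } PSymbol;          /* set P */
typedef enum { MOVE_NONE, MOVE_UP, MOVE_DOWN, MOVE_LEFT, MOVE_RIGHT /* ... */ } Movement;

/* --- placeholder blocks ------------------------------------------------ */
static void read_channels(double *vh, double *vv) { *vh = 0.0; *vv = 0.0; }
static void denoise(double *vh, double *vv)       { (void)vh; (void)vv;   }
static void comparator_network(double vh, double vv, CSymbol *ch, CSymbol *cv) {
    *ch = (vh > 0.375) ? CH_POS : (vh < -0.375) ? CH_NEG : CH_O;
    *cv = (vv > 0.375) ? CV_POS : (vv < -0.375) ? CV_NEG : CV_O;
}
static PSymbol  pddfa_step(CSymbol ch, CSymbol cv) { (void)ch; (void)cv; return P_NONE; }
static Movement mcdfa_step(PSymbol p)              { (void)p; return MOVE_NONE; }
static void     lcd_show(Movement m)               { printf("movement %d\n", (int)m); }

/* --- top-level loop, one pass per sample ------------------------------- */
int main(void) {
    for (int i = 0; i < 10; ++i) {            /* a few iterations for the demo */
        double vh, vv;
        CSymbol ch, cv;

        read_channels(&vh, &vv);              /* pre-processing block          */
        denoise(&vh, &vv);                    /* de-noising block              */
        comparator_network(vh, vv, &ch, &cv); /* comparators + NOR gates       */

        PSymbol p = pddfa_step(ch, cv);       /* emits V+/V-/H+/H- on peaks    */
        if (p != P_NONE) {
            Movement m = mcdfa_step(p);       /* accepts after 2-4 symbols     */
            if (m != MOVE_NONE)
                lcd_show(m);
        }
    }
    return 0;
}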

The functions governing the digital values of the elements of set C are shown in Table I (only the conditions for 'Digital Logic 1' are shown; the value is 'Digital Logic 0' in all other cases).

TABLE I
DIGITAL VALUES IN SET C (Comparator and NOR Gate network output)

  Element of set C    Condition for which value is Logic 1
  CV+                 VV ≥ TVH
  CV−                 VV ≤ TVL
  CVo                 TVL < VV < TVH
  CH+                 VH ≥ THH
  CH−                 VH ≤ THL
  CHo                 THL < VH < THH
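As a concrete reading of Table I together with the ±37.5% threshold rule described in Section III, the following C sketch derives the threshold pair for one channel from a calibration record and maps a sample onto the +/−/rest symbols of that channel; the buffer contents and size are illustrative assumptions only.

/* Sketch of the Table I mapping for one channel: thresholds are taken as
 * +/-37.5 % of the largest positive/negative excursion seen in a
 * calibration record, as described in the paper. */
#include <stdio.h>

typedef enum { C_PLUS, C_MINUS, C_REST } CSymbol;  /* +, -, rest zone of one channel */

typedef struct { double t_high, t_low; } Thresholds;

/* derive the high/low thresholds from a calibration buffer of one channel */
static Thresholds calibrate(const double *buf, int n) {
    double max_pos = 0.0, max_neg = 0.0;
    for (int i = 0; i < n; ++i) {
        if (buf[i] > max_pos) max_pos = buf[i];
        if (buf[i] < max_neg) max_neg = buf[i];
    }
    Thresholds t = { 0.375 * max_pos, 0.375 * max_neg };  /* +/-37.5 % */
    return t;
}

/* Table I: logic-1 condition for each element of set C (one channel) */
static CSymbol classify(double v, Thresholds t) {
    if (v >= t.t_high) return C_PLUS;    /* CH+ / CV+ */
    if (v <= t.t_low)  return C_MINUS;   /* CH- / CV- */
    return C_REST;                       /* CHo / CVo */
}

int main(void) {
    double calib[] = { 0.1, 0.8, -0.9, 0.2, -0.1 };       /* made-up record */
    Thresholds t = calibrate(calib, 5);
    printf("TH = %.3f, TL = %.3f, symbol(0.5) = %d\n",
           t.t_high, t.t_low, (int)classify(0.5, t));
    return 0;
}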

IV. PDDFA

The Peak Detection Deterministic Finite Automata, PDDFA = (Q1, Σ1, δ1, q01), was developed to trigger the detection of positive/negative peaks and rest zones in the horizontal and vertical channels of the EOG signal. The DFA is shown in Fig. 7. Here, Q1 is the set of all states of the finite automaton, each drawn with the name of the state written inside it. Σ1 is the set of input symbols, Σ1 = C, where C is the set of all outputs of the comparator and NOR-gate network, C = {CV+, CV−, CVo, CH+, CH−, CHo} (explained under the heading 'EOG CLASSIFICATION PROCESS FLOW'). δ1 is the set of all transition functions, represented by arrows; if a transition label contains an element of the set P = {V+, V−, H+, H−} (explained under the heading 'MCDFA') preceded by a comma, that element of P is used as input by the MCDFA. q01 represents the start state, as indicated in Fig. 7.


Fig. 7. Peak Detection Deterministic Finite Automata
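As an illustration of how a DFA with outputs of this kind can be realized in software, the following C sketch implements peak detection for a single channel as a Mealy-style transition table: a peak symbol is emitted when the signal returns to the rest zone after an excursion beyond a threshold. The three states and the choice of emission point are simplifying assumptions for illustration and do not reproduce the exact structure of the authors' Fig. 7.

/* Mealy-style sketch of peak detection for ONE channel; the PDDFA of Fig. 7
 * handles both channels.  States and emission rule are assumptions. */
#include <stdio.h>

typedef enum { C_PLUS, C_MINUS, C_REST } CSymbol;              /* comparator symbols */
typedef enum { OUT_NONE, OUT_POS_PEAK, OUT_NEG_PEAK } PSymbol; /* emitted P symbols  */
typedef enum { S_REST, S_ABOVE, S_BELOW } State;

typedef struct { State next; PSymbol out; } Transition;

/* rows = current state; columns = input symbol (C_PLUS, C_MINUS, C_REST) */
static const Transition table[3][3] = {
    /* S_REST  */ { {S_ABOVE, OUT_NONE}, {S_BELOW, OUT_NONE}, {S_REST, OUT_NONE}     },
    /* S_ABOVE */ { {S_ABOVE, OUT_NONE}, {S_BELOW, OUT_NONE}, {S_REST, OUT_POS_PEAK} },
    /* S_BELOW */ { {S_ABOVE, OUT_NONE}, {S_BELOW, OUT_NONE}, {S_REST, OUT_NEG_PEAK} },
};

int main(void) {
    /* a synthetic run: rest, rise above threshold, fall back, dip below, return */
    CSymbol input[] = { C_REST, C_PLUS, C_PLUS, C_REST, C_MINUS, C_REST };
    State s = S_REST;
    for (int i = 0; i < 6; ++i) {
        Transition t = table[s][input[i]];
        if (t.out == OUT_POS_PEAK) printf("step %d: positive peak (e.g. V+)\n", i);
        if (t.out == OUT_NEG_PEAK) printf("step %d: negative peak (e.g. V-)\n", i);
        s = t.next;
    }
    return 0;
}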

V. MCDFA

The Movement Classifier Deterministic Finite Automata, MCDFA = (Q2, Σ2, δ2, q02, F2), was developed for the classification of sixteen different EOG signals acquired from the horizontal and vertical channels (the signal comprises the potentials generated by motion of the eye with two degrees of freedom, horizontal and vertical). The sixteen different eye movements, with their abbreviations, are shown in Fig. 8 and listed in Table II. Before performing any of these sixteen eye movements, the eye should be centered so that the EOG signals obtained from the horizontal and vertical channels lie in the horizontal and vertical rest zones respectively.


Fig. 8. Movement Classifier Deterministic Finite Automata

In the MCDFA, Q2 is the set of all states of the finite automaton, each shown with the name of the state written inside it. Σ2 = P ∪ T is the set of input symbols, where P is the set of output symbols generated by the PDDFA, P = {V+, V−, H+, H−}, and T is the set of output signals generated by the timeout timers, T = {TV, TH, TD}; here TV, TH, and TD are the vertical, horizontal, and diagonal timeouts respectively. δ2 is the set of all transition functions, represented by arrows; q02 represents the start state, as indicated in Fig. 8; and F2 represents the set of all finish states, each shown with the name of the state written inside it. All intermediate states between the start and final states are divided into four groups {S1..S4}. When a transition to state S1, S2, or S3 occurs, the horizontal, vertical, or diagonal timeout timer is enabled respectively.

Eye movements written as a comma-separated list inside parentheses in Table II (column 'Eye Movement') must be performed simultaneously.

TABLE II
THE SIXTEEN EYE MOVEMENTS

  Abbreviation    Eye Movement
  UP              Up
  DOWN            Down
  LEFT            Left
  RIGHT           Right
  UP DOWN         Up then Down
  DOWN UP         Down then Up
  LEFT RIGHT      Left then Right
  RIGHT LEFT      Right then Left
  UL              Up and Left simultaneously
  UR              Up and Right simultaneously
  DR              Down and Right simultaneously
  DL              Down and Left simultaneously
  UL DR           (Up, Left) then (Down, Right)
  UR DL           (Up, Right) then (Down, Left)
  DL UR           (Down, Left) then (Up, Right)
  DR UL           (Down, Right) then (Up, Left)

δ2(q02, H−) = qH−;  δ2(qH−, V−) = qH−V−;  δ2(qH−V−, V+) = qH−V−V+;  δ2(qH−V−V+, H+) = qDL UR

Fig. 9. Mathematical representation of the (down, left) → (up, right) eye movement in the context of the MCDFA
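The acceptance of the symbol sequence in Fig. 9 can be traced programmatically. The following C sketch checks only the (down, left) → (up, right) path, with the reject-and-reset behaviour described in the text below; the full sixteen-class automaton and its timeout timers are omitted, and the state numbering is an assumption for illustration.

/* Illustration only: checking the (down,left)->(up,right) input sequence of
 * Fig. 9 symbol by symbol.  The full 16-class MCDFA and its timeout timers
 * are omitted; state numbers are arbitrary. */
#include <stdio.h>

typedef enum { V_POS, V_NEG, H_POS, H_NEG } PSymbol;   /* outputs of the PDDFA */

/* returns 1 if the n-symbol stream is accepted as DL->UR, 0 otherwise */
static int accept_dl_ur(const PSymbol *in, int n) {
    int state = 0;                        /* 0 = start state q02 */
    for (int i = 0; i < n; ++i) {
        switch (state) {
        case 0:  /* expect first half of (down,left): V- or H- */
            if (in[i] == V_NEG)      state = 1;   /* saw V-, wait for H- */
            else if (in[i] == H_NEG) state = 2;   /* saw H-, wait for V- */
            else return 0;                        /* no labelled edge: reject */
            break;
        case 1: if (in[i] == H_NEG) state = 3; else return 0; break;
        case 2: if (in[i] == V_NEG) state = 3; else return 0; break;
        case 3:  /* expect first half of (up,right): V+ or H+ */
            if (in[i] == V_POS)      state = 4;
            else if (in[i] == H_POS) state = 5;
            else return 0;
            break;
        case 4: if (in[i] == H_POS) state = 6; else return 0; break;
        case 5: if (in[i] == V_POS) state = 6; else return 0; break;
        }
    }
    return state == 6;                    /* 6 = final state q(DL UR) */
}

int main(void) {
    PSymbol seq[] = { H_NEG, V_NEG, V_POS, H_POS };  /* the Fig. 9 trace */
    printf("DL->UR accepted: %d\n", accept_dl_ur(seq, 4));
    return 0;
}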

The MCDFA works by taking a stream of input symbols (here, the elements of set Σ2 are referred to as symbols). It accepts or rejects an EOG signal as follows. Starting in the start state, for each input symbol the automaton transitions to the next state following the transition function δ2(p, a) = q, where p is the present state, a is the current input, and q is the next state. After making n transitions (n ∈ {2, 3, 4}) for an n-symbol EOG signal, if the automaton is in a final state, it accepts the EOG signal. If it is not in a final state, or if at some point there was no appropriate input (i.e., no labelled edge to follow), it rejects the EOG signal and resets the automaton to the start state. In Fig. 8, the sequence for detection of the diagonal (down, left) → (up, right) eye movement is illustrated by highlighting all visited states and transitions with increased line width. Mathematically, the input and transition sequence is represented as shown in Fig. 9.

VI. IMPLEMENTATION ON A MICROCONTROLLER

We chose the PIC16F877A [15] as the microcontroller for the implementation of the EOG classification system described above. The transition δ1(p, a) = q from state to state is completely determined by the input a and the initial state p; the algorithms merely follow the correct branches based on the information provided.

A. For PDDFA state transitions with and without output

1. PDDFA_<p>:
2.   IF input = <a> THEN
3.     GOTO PDDFA_<q>
4.   ENDIF
5.   GOTO PDDFA_<p>

Fig. 10. Algorithm for PDDFA state transition without output

1. PDDFA_<p>:
2.   IF C = <a> THEN
3.     PDDFA_OP ← <P>
4.     GOSUB MCDFA
5.     GOTO PDDFA_<q>
6.   ENDIF
7.   GOTO PDDFA_<p>

Fig. 11. Algorithm for PDDFA state transition with output

Algorithms for PDDFA state transitions with and without output are shown in Fig. 10 and Fig. 11 respectively. Here, <p> and <q> are the names of the initial and final states for a given state transition in the PDDFA, and <a> is the output of the comparator and NOR-gate network; <a> belongs to set C. Lines 2-4 in Fig. 10 and lines 2-6 in Fig. 11 must be repeated for each possible transition from state <p> of the PDDFA. <P> in Fig. 11 is the output of the PDDFA; <P> belongs to set P. Line 4 in Fig. 11 transfers program control from the PDDFA module to the MCDFA module, which uses <P> as input for its state transition functions and, after a successful state transition, returns control back to the PDDFA module.

B. For MCDFA state transitions to intermediate and final states

1. MCDFA_<p>:
2.   IF PDDFA_OP = <a> THEN
3.     MCDFA_STATE ← MCDFA_<q>
4.     RETURN
5.   ENDIF
6.   MCDFA_STATE ← MCDFA_START
7.   RETURN

Fig. 12. Algorithm for MCDFA state transition to intermediate states

1. MCDFA_<p>:
2.   IF PDDFA_OP = <a> THEN
3.     GOSUB EYE_MOVEMENT_<q>
4.     MCDFA_STATE ← MCDFA_START
5.     RETURN
6.   ENDIF
7.   MCDFA_STATE ← MCDFA_START
8.   RETURN

Fig. 13. Algorithm for MCDFA state transition to final states

Algorithms for MCDFA state transitions to intermediate and final states are shown in Fig. 12 and Fig. 13 respectively. Here, <p> and <q> are the names of the initial and final states for a given state transition in the MCDFA, and <a> is the output of the PDDFA, belonging to set P. Lines 2-5 in Fig. 12 and lines 2-6 in Fig. 13 must be repeated for each possible transition from state <p> of the MCDFA. <q> in Fig. 13 is the detected eye movement; the module EYE_MOVEMENT_<q> is used for displaying the detected eye movement on the LCD.
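For readers who prefer a C view of the label-and-GOTO skeletons in Figs. 10-13, the sketch below expresses the same structure with switch statements and a single MCDFA state variable. The state and symbol names stand in for <p>, <q>, <a>, and <P> and are hypothetical; this is not the PIC16F877A firmware itself.

/* C rendering of the skeletons in Figs. 10-13 with hypothetical state and
 * symbol names; the PIC implementation uses labels with GOTO/GOSUB. */
#include <stdio.h>

typedef enum { C_SYM_A, C_SYM_OTHER } CSymbol;               /* stand-ins for set C   */
typedef enum { P_NONE, P_SYM } PSymbol;                      /* stand-ins for set P   */
typedef enum { PDDFA_P, PDDFA_Q } PddfaState;                /* <p>, <q> in Fig.10/11 */
typedef enum { MCDFA_START, MCDFA_P, MCDFA_Q } McdfaState;   /* <p>, <q> in Fig.12/13 */

static McdfaState mcdfa_state = MCDFA_START;

static void eye_movement_q(void) { printf("eye movement <q> detected\n"); }

/* Figs. 12/13: one MCDFA step driven by the PDDFA output (PDDFA_OP) */
static void mcdfa(PSymbol pddfa_op) {
    switch (mcdfa_state) {
    case MCDFA_START:                         /* Fig. 12: to an intermediate state */
        if (pddfa_op == P_SYM) { mcdfa_state = MCDFA_P; return; }
        mcdfa_state = MCDFA_START;            /* no labelled edge: reset */
        return;
    case MCDFA_P:                             /* Fig. 13: to a final state */
        if (pddfa_op == P_SYM) { eye_movement_q(); mcdfa_state = MCDFA_START; return; }
        mcdfa_state = MCDFA_START;
        return;
    default:
        mcdfa_state = MCDFA_START;
        return;
    }
}

/* Figs. 10/11: one PDDFA step; a transition "with output" calls the MCDFA */
static PddfaState pddfa(PddfaState state, CSymbol c) {
    switch (state) {
    case PDDFA_P:
        if (c == C_SYM_A) {                   /* Fig. 11, lines 2-6            */
            mcdfa(P_SYM);                     /* PDDFA_OP <- <P>; GOSUB MCDFA  */
            return PDDFA_Q;                   /* GOTO PDDFA_<q>                */
        }
        return PDDFA_P;                       /* GOTO PDDFA_<p>                */
    case PDDFA_Q:
    default:
        return state;                         /* transitions of other states   */
    }
}

int main(void) {
    /* two peak symbols in a row: the first drives the MCDFA to an intermediate
     * state (Fig. 12), the second reaches a final state and displays (Fig. 13) */
    pddfa(PDDFA_P, C_SYM_A);
    pddfa(PDDFA_P, C_SYM_A);
    return 0;
}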

VII. CONCLUSION

To conclude, in this paper we proposed a scheme for EOG signal classification which can be used as a mathematical model for the development of EOG-based medical instrumentation where device control is the main objective.

The sixteen classified signals provide a large set of input commands, enabling the EOG to serve as a primary source of input for many multimode controllers. DFAs, being among the most practical models of computation, prove useful for developing linear-time, constant-space, online algorithms (ones that process their input piece by piece, without having the entire input available from the start). The proposed scheme can be applied universally to the development of embedded systems requiring real-time EOG signals as their primary input.

REFERENCES

[1] D. Kumar and E. Poole, "Classification of EOG for Human Computer Interface," in Proc. Second Joint EMBS/BMES Conference, Houston, TX, USA, 2002.
[2] G. Norris and E. Wilson, "The Eye Mouse, an eye communication device," in IEEE SMC, 1997, pp. 66-67.
[3] Y. Kuno, T. Yagi, and Y. Uchikawa, "Development of eye pointer with free head-motion," in Proc. 21st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 20, 1998.
[4] S. H. Kwon and H. C. Kim, "EOG-Based Glasses-Type Wireless Mouse for the Disabled," in Proc. First Joint BMES/EMBS Conference (Serving Humanity, Advancing Technology), Atlanta, GA, USA, 1999.
[5] Q. Ding, K. Tong, and G. Li, "Development of an EOG (Electro-Oculography) Based Human-Computer Interface," in Proc. 27th Annual Conference of the IEEE Engineering in Medicine and Biology Society, 2005, pp. 6829-6831.
[6] Y. Kim, N. Doh, Y. Youm, and W. K. Chung, "Development of Human-Mobile Communication System Using Electrooculogram Signals," in Proc. 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 4, 2001, pp. 2160-2165.
[7] D. W. Patmore and R. B. Knapp, "Towards an EOG-Based Eye Tracker for Computer Control," in Proc. Annual ACM Conference on Assistive Technologies, 1998, pp. 197-203.
[8] "Locked-In Syndrome Information Page," National Institute of Neurological Disorders and Stroke (NINDS). Available: http://www.ninds.nih.gov/disorders/lockedinsyndrome/lockedinsyndrome.htm (April 1, 2007).
[9] J. J. Gu, M. Meng, A. Cook, and M. G. Faulkner, "A Study of Natural Eye Movement Detection and Ocular Implant Movement Control Using Processed EOG Signals," in Proc. 2001 IEEE International Conference on Robotics and Automation, Seoul, Korea, May 21-26, 2001, p. 592.
[10] H. Davison, Physiology of the Eye. New York: Pergamon Press, 1990.
[11] R. Barea, L. Boquete, L. M. Bergasa, E. Lopez, and M. Mazo, "Electro-Oculographic Guidance of a Wheelchair Using Eye Movements Codification," The International Journal of Robotics Research, vol. 22, no. 7-8, pp. 641-652, July-August 2003.
[12] D. Kumar and E. Poole, "Classification of EOG for Human Computer Interface," in Proc. Second Joint EMBS/BMES Conference, Houston, TX, USA, October 23-26, 2002, pp. 64-67.
[13] A. Bhandari, M. Trikha, V. Khare, and S. Anand, "Wavelet Based Novel Technique for Signal Conditioning of EOG Signals," in Proc. IEEE INDICON 2006.
[14] R. H. S. Carpenter, Movements of the Eyes, 2nd ed. London: Pion, 1988.
[15] "PIC16F877A." Available: http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1335&dDocName=en010242 (April 9, 2007).