
Subject: UCAmI 2015, IWAAL 2015 & AmIHEALTH 2015 notification for paper 89
From: [email protected]
Reply-To: [email protected]
To: Victoria Meza-Kubo
Date: 13 August 2015, 3:06

Dear Victoria Meza-Kubo,

We are very glad to inform you that your paper:

Number: 89
Title: Processing brain biometric signals in order to assess the user experience

has been accepted as a FULL PAPER for publication and presentation at the 1st International Conference on Ambient Intelligence for Health (AmIHEALTH 2015).

Please revise the paper carefully according to the reviewers' comments, if any, and upload the camera-ready version via EasyChair before SEPTEMBER 13th. Make sure that the manuscript complies with the LNCS FORMAT GUIDELINE: http://mami.uclm.es/ucami-iwaal-amihealth-2015/cfp.html. Remember, as a FULL PAPER your manuscript should not exceed the 12-PAGE LIMIT, including figures and appendices.

In order to include your paper in the Springer proceedings, we need you to carefully follow these 3 MANDATORY instructions:

1. UPLOAD the camera-ready version of your paper. Camera-ready papers should be uploaded as .rtf, .doc or LaTeX: https://easychair.org/conferences/?conf=ucami-iwaal-amihealth-2015

2. UPLOAD, as a separate file, the copyright form, filled in all its parts and signed by the corresponding author. Download the copyright form here: ftp://ftp.springer.de/pub/tex/latex/llncs/LNCS-Springer_Copyright_Form.pdf. Please fill in all fields, including:
- Title of the book: Ambient Intelligence for Health (vol 9456)
- Volume Editors: José Bravo, Ramón Hervás, Vladimir Villarreal.

3. Complete the registration for the conference by September 13th (at least one author per paper). Registration may be completed online at http://mami.uclm.es/ucami-iwaal-amihealth-2015/registration.html

You can find the reviewers' comments below. Thank you for your cooperation, and do not hesitate to contact us with any further questions.

Best regards,

* Please do not reply to this mail account. You may contact the General Chair ([email protected]) or the AmIHEALTH chairs ([email protected], [email protected]) with questions regarding AmIHEALTH.

Vladimir Villarreal & Ramón Hervás
AmIHEALTH Chairs


----------------------- REVIEW 1 ---------------------
PAPER: 89
TITLE: Processing brain biometric signals in order to assess the user experience
AUTHORS: Ivan Carrillo, Victoria Meza-Kubo, Alberto L. Morán, Gilberto Galindo and Eloisa García-Canseco

OVERALL EVALUATION: 1 (weak accept)
RELEVANCE TO THE UCAmI AUDIENCE: 3 (fair)
ORIGINALITY: 4 (good)
TECHNICAL QUALITY: 3 (fair)
LITERATURE REVIEW: 3 (fair)
EMPIRICAL SUPPORT: 3 (fair)
PRESENTATION: 3 (fair)
IS THE PAPER IN LNCS FORMAT?: 1 (Yes)
IS THE ARTICLE ACCEPTABLE AS?: 2 (Short Paper)

----------- REVIEW -----------
The article describes a method for the automatic classification of emotions (e.g. pleasant vs. unpleasant). The procedure is based on the use of EEG to train a neural network to perform the classification. The paper in principle has some potential, but also several weaknesses:
- The idea of combining EEG and neural networks to classify emotions has been investigated by other authors (who are not referenced in this paper). See for example Nasehi et al. (2012): http://www.wseas.org/multimedia/journals/signal/2012/54-870.pdf
- It is not clear what the relation with user experience is (as the title of the paper promises), since the authors did not test the emotional responses to a specific product or interface; they just used the IAPS.
- The validation procedure involved only 8 participants; moreover, it is not clear why only elderly users were involved.

P.S. "The brain has approximately 100 million neurons" — did the authors mean billions?

----------------------- REVIEW 2 ---------------------
PAPER: 89
TITLE: Processing brain biometric signals in order to assess the user experience
AUTHORS: Ivan Carrillo, Victoria Meza-Kubo, Alberto L. Morán, Gilberto Galindo and Eloisa García-Canseco

OVERALL EVALUATION: 2 (accept)
RELEVANCE TO THE UCAmI AUDIENCE: 4 (good)
ORIGINALITY: 4 (good)
TECHNICAL QUALITY: 4 (good)
LITERATURE REVIEW: 4 (good)
EMPIRICAL SUPPORT: 4 (good)
PRESENTATION: 4 (good)
IS THE PAPER IN LNCS FORMAT?: 1 (Yes)
IS THE ARTICLE ACCEPTABLE AS?: 3 (Full Paper)

----------- REVIEW -----------
This paper proposes a neural network to identify pleasant and unpleasant emotions from recorded electroencephalography signals. This is a nice and interesting paper. Before publishing it, I suggest improving Section 2, Related Work. Furthermore, I suggest explaining the signal feature extraction better and underlining the contribution of the research and future developments. Check English.


----------------------- REVIEW 3 ---------------------
PAPER: 89
TITLE: Processing brain biometric signals in order to assess the user experience
AUTHORS: Ivan Carrillo, Victoria Meza-Kubo, Alberto L. Morán, Gilberto Galindo and Eloisa García-Canseco

OVERALL EVALUATION: 0 (borderline paper)
RELEVANCE TO THE UCAmI AUDIENCE: 3 (fair)
ORIGINALITY: 2 (poor)
TECHNICAL QUALITY: 3 (fair)
LITERATURE REVIEW: 2 (poor)
EMPIRICAL SUPPORT: 3 (fair)
PRESENTATION: 3 (fair)
IS THE PAPER IN LNCS FORMAT?: 1 (Yes)
IS THE ARTICLE ACCEPTABLE AS?: 2 (Short Paper)

----------- REVIEW -----------
The paper explores the classification of human emotion using the Emotiv sensor and Patternnet. They reach some acceptable results. The paper's literature review seems wanting: the work fails to acknowledge and review the many techniques used for emotion classification from EEG; see, for example, the reviews by Sohaib et al. (2013), Kim et al. (2013), and Lee and Hsieh (2014).


Processing EEG Signals Towards the Construction of a User Experience Assessment Method

Ivan Carrillo, Victoria Meza-Kubo, Alberto L. Morán, Gilberto Galindo, and Eloisa García-Canseco

Universidad Autónoma de Baja California, Ensenada, Baja California, Mexico
{ivan.carrillo,mmeza,alberto.moran,gilberto.galindo.aldana,eloisa.garcia}@uabc.edu.mx

Abstract. This paper proposes a neural network to identify pleasant and unpleasant emotions from recorded electroencephalography (EEG) signals, towards the construction of a method to assess user experience (UX). EEG signals were obtained with an Emotiv EEG device. The input data were recorded during the presentation of visual stimuli that induce emotions known a priori. The recorded EEG signals were preprocessed to enhance the differences between them and then used to train and validate a Patternnet neural network. The results indicate that the neural network achieves an accuracy rate of 99.61 % on 258 preprocessed signals.

Keywords: EEG · Emotions · Elderly · IAPS · Neural networks

1 Introduction

The increasing incidence of diseases such as Alzheimer's disease has moved researchers to look for alternative non-drug treatments, including technologies supporting cognition, that seek to maintain the cognitive status of the elderly through cognitive stimulation [1,2]. To this end, diverse intelligent environment applications that seek to promote cognitive stimulation (CS) have been proposed [2]. However, due to the characteristics of this group of users, namely the decline in their physical and cognitive skills, it is necessary to assess the elderly's perception regarding the use, acceptance and adoption of these applications. In the literature, several usability and user experience (UX) evaluations have been reported that assess users' perception regarding the use of technology [3]. However, conducting this kind of assessment may be a difficult task due to the inherent limitations of the evaluation methods themselves. For example, it is well known that in techniques based on self-report, participants tend to respond with what the researcher wants to hear, or tend not to be sincere and embellish their reported perceptions because they feel assessed, or because they have forgotten the details of their experience [4]. Because of this, the results of these evaluations may not be very reliable.


As an alternative to traditional UX assessment techniques, we propose to record the brain electrical activity of the participants by means of a low-cost electroencephalogram (EEG) device, and to infer pleasant and unpleasant emotions in an automated manner using a Patternnet neural network. The goal is to introduce a method with which we could identify emotions using EEG signals and, based on these data, determine the UX. In this paper we present the neural network designed and the processing techniques used for emotion recognition towards the construction of a method to assess UX.

2 Related Work

There are several ways of recording psychophysiological data from humans, for example Galvanic Skin Response (GSR), Electromyography (EMG), Electrocardiography (ECG) and Electroencephalography (EEG) [5]. Inferring emotional states from EEG has received considerable attention, as EEG has a rapid response time, can directly reflect emotional states, and is less expensive than other related methods, so it is widely used to monitor brain activity in BCI research [6,7]. In recent years, there has been a growing number of efforts to recognize a person's emotion in real time using EEG. For example, EmoRate, developed as a commercial product (Emotiv Corp.), detects the flow of the emotional state while the user is watching a film [8]. Brown et al. proposed an EEG-based affective computing system that measures the state of valence and transmits it via a wireless link [9]. Petrantonakis [10] analyzed EEG signals and used neural networks to classify them into six emotions based on emotional valence and arousal, with a 61 % success rate. Chai [11] evaluated three mobile phone applications using three self-report questionnaires to obtain subjective data, as well as recording the participants' brain activity using EEG; these data were used to determine the positive and negative states in the UX of the applications used. Hakvoort [12] evaluated the UX of a brain-computer interface (BCI) game with respect to the expectations of users, using visual evoked responses as stimuli and an EEG to register physiological data. In a similar fashion, in this work we propose to identify pleasant and unpleasant emotions by means of a neural network fed with EEG data.

3 Emotions

Emotions can be positive or negative. At all times, no matter the context of the situation, people experience a range of emotions, whether positive (e.g. joy, gratefulness, sympathy, happiness, love) or negative (e.g. displeasure, irritability, disgust, anger, sadness) [13,14]. Positive emotions are associated with the activation of regions of the left hemisphere, while negative emotions relate to the activation of regions of the right hemisphere [15–18].


In the frequency domain, the spectral power in various frequency bands has been implicated in the emotional state. The asymmetry analysis of the alpha power is recognized as a useful procedure for the study of emotional reactivity [19]; further, it is common to find asymmetries in the frontal region of the brain, which may be observed in a subject from childhood onward [20]. In a study reported in [21], a spectral analysis of the electrical activity obtained through an EEG demonstrated that the alpha power varies depending on the emotion present (positive or negative).

To induce emotions experimentally, different stimulation modalities have been used, including visual, auditory, tactile and olfactory stimulation. A recurrent technique to induce emotions is to use standard stimulus sets such as the International Affective Picture System (IAPS) or the International Affective Digitized Sound System (IADS), which provide sets of normative pictures (IAPS) or sounds (IADS) that serve as emotional stimuli to induce emotional changes and attention levels [22,23]. In this work, IAPS pictures have been used to induce emotions in the subjects; EEG signals were acquired using the Emotiv EEG device; and a Patternnet neural network was designed in order to identify pleasant and unpleasant emotions.
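A note for reference: the alpha-power asymmetry analysis mentioned above is commonly operationalized as a frontal alpha asymmetry index computed from the alpha power at homologous frontal electrodes (e.g., F3/F4 in the 10–20 system). The paper does not state the exact form it refers to; the usual convention is

$\mathrm{FAA} = \ln(P_{\alpha,\mathrm{F4}}) - \ln(P_{\alpha,\mathrm{F3}})$

where $P_{\alpha}$ denotes alpha-band power. Since alpha power is inversely related to cortical activation, positive values indicate relatively greater left-hemisphere activation, which the studies cited above associate with positive emotions.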

4 Methodology

In order to define a method that interprets a set of UX emotions using an EEG, we conducted a preliminary study in which selected pictures were presented to stimulate emotions known a priori while the EEG response was recorded. The EEG was then filtered and processed to train a neural network used to identify the UX from two basic emotional states: pleasant or unpleasant. For the study we used the International Affective Picture System (IAPS): a subset of the pictures proposed in Bradley [23], which evoke specific emotions, was used. The following categories of emotion in pictures were used (see Fig. 1): (A) fear (10), (B) pleasant (10), (C) unpleasant (10), and (D) neutral (29).

Fig. 1. Example of selected pictures from the IAPS. (A) Fear pictures. (B) Pleasant pictures. (C) Unpleasant pictures. (D) Neutral pictures.


Participants. Participants were eight older adults, 2 male and 6 female, aged 60 to 83 (mean 72.3 years, SD 8.46 years). Participants received coffee service and a gift (equivalent to approximately USD $5) for their participation in the study. Inclusion criteria were: aged over 60 years, not having suffered head trauma, absence of moderate or severe cognitive problems, and absence of visual problems (i.e. being able to see well without glasses at a distance of 30–50 cm). To determine that participants did not have cognitive problems, we applied the Mini Mental State Examination (MMSE).

Materials
– The Emotiv EEG headset was used to acquire EEG data; Emotiv EEG obtains and records brain activity through 14 electrodes (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4). The electrodes were placed according to the International 10–20 System, which sets the position of the electrodes on the cranial surface corresponding to cortical areas.
– Software: the Camtasia Studio software was used to record the facial expressions of each participant; the Emotiv EEG Control Panel application was used to calibrate the device; the TestBench software was used to record brain activity data from the 14 electrodes; EEGExProc was used to display the images; and EEGLAB was used to process the EEG data [25].

Procedure
– Introduction. First, the participants were introduced individually to the experiment, and the characteristics of the Emotiv EEG headset were explained. They were also asked to sign a consent form, and the MMSE was applied.
– Emotiv EEG calibration. For best performance of the device, it was calibrated for each participant by recognizing facial gestures and by manipulating a virtual 3D cube through brain interaction using the Emotiv EEG Control Panel application.
– Brain signals acquisition. In this stage, each participant, wearing the Emotiv EEG device, was presented with a set of pictures according to the proposal in Bertron [24]. Pictures were presented in the following arrangement: pleasant, fear, unpleasant and neutral, for 6 seconds each; immediately afterwards, the participant was asked to indicate his/her impression upon seeing the picture, answering 1 to 4 according to one of the following categories: pleasant, unpleasant, neutral or fear.
– Signal processing. The process followed for the reduction and analysis of the EEG is divided into four phases (a MATLAB sketch of phases 2 and 3 is given after this list):

1. EEG Capture. Brain electrical activity was captured using the Emotiv EEG device. The brain activity of each participant was recorded with the TestBench software, and a label was inserted into the recorded signal indicating the category of the emotion presented, for later signal segmentation. The EEG frequency bands acquired were alpha, beta, and theta.

Processing EEG Signals Towards the Construction of a User Experience

285

2. Signal Preprocessing. Prior to extracting the characteristics of the signal, an artifact removal procedure was applied. The preprocessing techniques applied to the signal were:
• Removal of the signal mean and of its best linear fit:

  $B = A - \bar{A}$  (1)

  where $A$ is the original data and $\bar{A}$ is the data mean, followed by

  $C = \mathrm{detrend}(B)$  (2)

• A Hamming window was applied to the signal.
• A Finite Impulse Response (FIR) bandpass filter (1 Hz–30 Hz) was applied.

3. Signal Feature Extraction. In this phase the Fast Fourier Transform (FFT) was applied to the brain signals in order to obtain the signal characteristics:

  $C(x) = \sum_{j=1}^{N} c(j) \, W_N^{(j-1)(x-1)}$  (3)

  where $W_N = e^{-2\pi i/N}$, $i$ denotes the imaginary unit, and $N$ is the data size. The features extracted from the signal were:
• Frequency.
• Magnitude:

  $P_{ij} = |C_{ij}| = \sqrt{\mathrm{real}(C_{ij})^2 + \mathrm{imag}(C_{ij})^2}$  (4)

• Power:

  $H_{ij} = (P_{ij})^2$  (5)

• Separation by brain wave band (alpha, beta, theta):

  $M\alpha = \begin{pmatrix} \alpha_{11} & \cdots & \alpha_{1n} \\ \vdots & \ddots & \vdots \\ \alpha_{m1} & \cdots & \alpha_{mn} \end{pmatrix}$  (6)

  $M\beta = \begin{pmatrix} \beta_{11} & \cdots & \beta_{1n} \\ \vdots & \ddots & \vdots \\ \beta_{m1} & \cdots & \beta_{mn} \end{pmatrix}$  (7)

  $M\theta = \begin{pmatrix} \theta_{11} & \cdots & \theta_{1n} \\ \vdots & \ddots & \vdots \\ \theta_{m1} & \cdots & \theta_{mn} \end{pmatrix}$  (8)

• Signal average for each band:

  $\bar{M}\alpha = \sum_{i=1}^{n} \sum_{j=1}^{k} Ma_{ij}$  (9)
  $\bar{M}\beta = \sum_{i=1}^{n} \sum_{j=1}^{k} Mb_{ij}$  (10)
  $\bar{M}\theta = \sum_{i=1}^{n} \sum_{j=1}^{k} Mt_{ij}$  (11)

• Signal maximum value for each band:

  $Max\alpha = \max(\bar{M}\alpha)$  (12)
  $Max\beta = \max(\bar{M}\beta)$  (13)
  $Max\theta = \max(\bar{M}\theta)$  (14)

• Signal minimum value for each band:

  $Min\alpha = \min(\bar{M}\alpha)$  (15)
  $Min\beta = \min(\bar{M}\beta)$  (16)
  $Min\theta = \min(\bar{M}\theta)$  (17)

• Standard deviation for each band:

  $ds\alpha = \sqrt{\frac{\sum_{i=1}^{n} \sum_{j=1}^{k} (Ma_{ij} - \bar{M}\alpha)^2}{nk}}$  (18)
  $ds\beta = \sqrt{\frac{\sum_{i=1}^{n} \sum_{j=1}^{k} (Mb_{ij} - \bar{M}\beta)^2}{nk}}$  (19)
  $ds\theta = \sqrt{\frac{\sum_{i=1}^{n} \sum_{j=1}^{k} (Mt_{ij} - \bar{M}\theta)^2}{nk}}$  (20)

4. Classification. Finally, the characteristics obtained were used to train a neural network (Fig. 2), which was then used to identify the emotions. A Patternnet neural network was used. Patternnet recognition networks are feed-forward networks that can be trained to classify inputs according to target classes. The neural network was trained using the brain signals from pleasant and unpleasant emotions (explained in the next section).
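To make phases 2 and 3 concrete, the following minimal MATLAB sketch reproduces the pipeline described above. It is our illustrative reconstruction, not the authors' code: the sampling rate (128 Hz, the Emotiv EPOC default), the FIR filter order, the band limits and all variable names are assumptions, and the paper does not fully specify how the per-band features map to the network's inputs.

  % Illustrative reconstruction of phases 2-3 (assumptions noted above).
  % `epoch` is assumed to be a [channels x samples] EEG segment cut around
  % one stimulus label; fs is the assumed Emotiv sampling rate.
  fs = 128;

  % Phase 2: preprocessing (Eqs. 1-2, Hamming window, 1-30 Hz FIR filter)
  B  = epoch - repmat(mean(epoch, 2), 1, size(epoch, 2)); % remove channel means (Eq. 1)
  C  = detrend(B')';                      % remove best linear fit per channel (Eq. 2)
  w  = hamming(size(C, 2))';              % Hamming window as a row vector
  Cw = C .* repmat(w, size(C, 1), 1);     % window each channel
  b  = fir1(64, [1 30] / (fs / 2));       % FIR bandpass design (order 64 assumed)
  Cf = filtfilt(b, 1, Cw')';              % zero-phase filtering along time

  % Phase 3: feature extraction (Eqs. 3-20)
  X  = fft(Cf, [], 2);                    % FFT of each channel (Eq. 3)
  P  = abs(X);                            % magnitude (Eq. 4)
  H  = P .^ 2;                            % power (Eq. 5)
  f  = (0:size(H, 2) - 1) * fs / size(H, 2);   % frequency axis in Hz

  Mtheta = H(:, f >= 4  & f < 8);         % band matrices (Eqs. 6-8);
  Malpha = H(:, f >= 8  & f < 13);        % band limits are assumed here
  Mbeta  = H(:, f >= 13 & f < 30);

  feats = [];                             % per-band statistics (Eqs. 9-20)
  for M = {Malpha, Mbeta, Mtheta}
      v = M{1}(:);
      feats = [feats, mean(v), max(v), min(v), std(v)]; %#ok<AGROW>
  end

A vector such as feats would be computed for each labeled epoch and then assembled into the input matrix of the classifier; as noted in the Discussion, the paper additionally averaged over the EEG channels.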


Fig. 2. The structure of the neural network.

5 Results

Participants' Verbal Responses. According to the verbal classification reported by each participant (see Table 1), the answers to the pictures selected to provoke pleasant emotions matched the expected category 91 % of the time, and the responses to unpleasant pictures agreed 84 % of the time. In both cases, the responses converge to the categories of the established test. By contrast, responses to the pictures selected to evoke fear agreed only 49 % of the time, while for those selected as neutral the responses corresponded 57 % of the time; that is, the responses in these categories were not as expected. For additional results see Carrillo et al. [26]. Given these results, only the brain signals recorded for pleasant and unpleasant pictures were used to train and test the neural network.

Table 1. Match averages reported by participants (P7's responses were excluded; see below).

IAPS pictures   | P1 | P2 | P3 | P4 | P5 | P6 | P8 | Match
Unpleasant (10) | 10 |  7 |  9 | 10 |  8 |  8 |  7 | 84 %
Pleasant (10)   |  8 | 10 | 10 |  8 | 10 |  8 | 10 | 91 %
Neutral (29)    | 17 | 11 | 17 | 26 | 10 | 14 | 20 | 57 %
Fear (10)       |  8 |  8 |  4 |  7 |  7 |  0 |  0 | 49 %

P: Participant; Match %: the percentage of verbal answers that match the picture category.


It is important to highlight that participant P7's results were removed. P7 was a special case, as she responded that all pictures were pleasant. It was observed that participant P7 was very anxious during the test, which, along with a possible misunderstanding during the explanation of the activity, could have caused her to always provide the same answer.

Neural Network Classification. A Patternnet recognition neural network was used. The target data for Patternnet recognition networks should consist of vectors of all zero values except for a 1 in element i, where i is the class they are to represent. The neural network was developed using MATLAB version R2013b. The data for the network categories (training, validation, test) were randomly selected. The network had 10 input neurons (one for each EEG channel) and two output neurons identifying two emotions: pleasant and unpleasant. After trying several settings for the neural network (10 to 1000 hidden neurons), the best result was obtained with 253 neurons. The network was trained with 258 inputs using the settings shown in Table 2.

Table 2. Neural network elements.

Neural network element  | Setting
Hidden layer neurons    | 253
Percent training data   | 80 %
Percent validation data | 10 %
Percent test data       | 10 %
Training function       | trainscg
Performance function    | crossentropy
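For illustration, the configuration in Table 2 maps onto MATLAB's Neural Network Toolbox as sketched below. This is a hedged reconstruction, not the authors' script: the variable names are assumptions, with `features` a 10 x 258 input matrix (one column per signal) and `targets` a 2 x 258 one-hot target matrix as described above.

  % Sketch of the classifier configuration reported in Table 2.
  % `features` (10 x 258) and `targets` (2 x 258, one-hot) are assumed inputs.
  net = patternnet(253);                      % 253 hidden-layer neurons
  net.trainFcn   = 'trainscg';                % scaled conjugate gradient
  net.performFcn = 'crossentropy';            % performance function
  net.divideFcn  = 'dividerand';              % random 80/10/10 split
  net.divideParam.trainRatio = 0.80;
  net.divideParam.valRatio   = 0.10;
  net.divideParam.testRatio  = 0.10;

  [net, tr] = train(net, features, targets);  % train on the 258 signals
  outputs = net(features);                    % network responses
  plotconfusion(targets, outputs);            % confusion matrix, as in Fig. 4
  plotregression(targets, outputs);           % regression plots, as in Fig. 3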

In order to validate the results obtained with the neural network, a linear regression was performed with the data from each section (training, validation, and test) (see Fig. 3). Figure 4 shows the confusion matrix for the three kinds of data combined (training, testing, and validation). The network outputs are very accurate, as can be seen from the high number of correct responses (network outputs vs. the pictures' prior categories) in the green squares (upper left and middle) and the low number of incorrect responses in the red squares (upper middle and first in the second row). The blue square (bottom right) shows the overall accuracy (99.61 %).

Discussion. From the verbal responses it was interesting to observe that, for the elderly, the standardized pictures did not always evoke the expected emotions. For instance, the worst case was the fear category: only 49 % of the responses indicated that the pictures evoked this emotion, while 39 % of the responses indicated unpleasant and 10 % were classified as neutral.


Fig. 3. Linear regression for training, validation and test data. R = 1 for each set, meaning that the network outputs are perfectly correlated with the targets for the training, validation and test data.

From this group of pictures, picture 59, which shows a skeleton, was the only one that did not evoke fear: 40 % of participants responded with neutral and 30 % with unpleasant. For the pictures in the neutral category, only 57 % were classified as neutral; 40 % were reported as pleasant and 3 % as unpleasant. Regarding the pictures in the unpleasant category, 84 % evoked unpleasant emotions in participants, while 11 % were reported as neutral, 3 % as fear and 2 % as pleasant. Participant P3 indicated that picture 31, which showed a hand surgery, was perceived as pleasant because he was a medical doctor, so it could be concluded that participant P3's professional background affected his answer. Finally, the best results were for the pictures in the pleasant category: as expected, 91 % of the responses were reported as pleasant, 7 % as neutral and 2 % as fear (see Table 1). The results show that participant P4 expressed that picture 45 evoked fear in her; in this case the picture corresponded to 3 girls smiling, so it could be a coding error by the participant. Picture 51, which corresponded to an older woman, was classified as neutral by participants P1 and P6.


Fig. 4. Confusion matrix for three kinds of data combined (training, testing, and validation).

Considering that the IAPS pictures used have been tested to cause known emotions, a possible explanation for the unexpected results for the fear (49 %) and neutral (57 %) pictures could be that, although the pictures have been tested with children and adults, the elderly population departs from the common perception; this difference may be caused by the elderly's vision of life, the whole context of the participants' lives and, of course, the natural decline that participants may have. Given these results for the identification of fear and neutral pictures by participants, the corresponding brain signals were not used as input to the neural network. Concerning the neural network, after the preprocessing of the brain signals, 258 pre-labeled signals were used: 129 corresponding to the pleasant emotion and 129 to the unpleasant emotion. It is important to mention that each input vector included an average over the 10 EEG channels, rather than the data of each channel separately. An analysis of the per-channel configuration must be performed later in order to compare the results obtained.

6 Conclusions

This paper presents preliminary results of our proposal to determine a set of emotions through brain signal (EEG) records. Participants' verbal responses validated the presence of pleasant and unpleasant emotions, especially when the pictures selected to evoke these emotions were presented. These records were used as input to a Patternnet neural network with a 253-neuron hidden layer, which achieved an accuracy rate of 99.61 % on the features of the 258 preprocessed signals.


Our future work includes performing the analysis using the 10 electrode channels separately. In addition, we plan to use the validated neural network to identify pleasant or unpleasant emotions from the brain signals of users of CS applications in order to assess their UX.

Acknowledgements. We acknowledge the support of UABC, in the form of the Programa de Servicio Social 212 and Proyecto Interno 231, and of CONACYT through scholarship number 538130 for the first author. We also acknowledge the elderly participants from Ensenada, B.C., México for their support and participation in the study.

References

1. Buiza, C., Soldatos, J., Petsatodis, T., Geven, A., Etxaniz, A., Tscheligi, M.: HERMES: pervasive computing and cognitive training for ageing well. In: Omatu, S., Rocha, M.P., Bravo, J., Fernández, F., Corchado, E., Bustillo, A., Corchado, J.M. (eds.) IWANN 2009, Part II. LNCS, vol. 5518, pp. 756–763. Springer, Heidelberg (2009)
2. Meza-Kubo, V., Morán, A.L.: UCSA: a design framework for usable cognitive systems for the worried-well. Pers. Ubiquit. Comput. 17(6), 1–11 (2012)
3. Meza-Kubo, V.: Guías para el diseño de aplicaciones de estimulación cognitiva utilizables por el adulto mayor. Ph.D. thesis, Universidad Autónoma de Baja California (2012)
4. Arhippainen, L., Tähti, M.: Empirical evaluation of user experience in two adaptive mobile application prototypes. In: Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia (2003)
5. Sohaib, A.T., Qureshi, S., Hagelbäck, J., Hilborn, O., Jerčić, P.: Evaluating classifiers for emotion recognition using EEG. In: Schmorrow, D.D., Fidopiastis, C.M. (eds.) AC 2013. LNCS, vol. 8027, pp. 492–501. Springer, Heidelberg (2013)
6. Kim, M.-K., Kim, M., Eunmi, O., Kim, S.-P.: A review on the computational methods for emotional state estimation from the human EEG. Comput. Math. Methods Med. 2013, 13 (2013)
7. Nasehi, S., Pourghassem, H.: An optimal EEG-based emotion recognition algorithm using Gabor. WSEAS Trans. Sig. Proc. 3(8), 87–99 (2012)
8. Sourina, O., Liu, Y.: A fractal-based algorithm of emotion recognition from EEG using arousal-valence model. In: Biosignals, pp. 209–214 (2011)
9. Brown, L., Grundlehner, B., Penders, J.: Towards wireless emotional valence detection from EEG. In: 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 2188–2191. IEEE (2011)
10. Petrantonakis, P.C., Hadjileontiadis, L.J.: Emotion recognition from EEG using higher order crossings. IEEE Trans. Inf. Technol. Biomed. 14(2), 186–197 (2010)
11. Chai, J., Ge, Y., Liu, Y., Li, W., Zhou, L., Yao, L., Sun, X.: Application of frontal EEG asymmetry to user experience research. In: Harris, D. (ed.) EPCE 2014. LNCS, vol. 8532, pp. 234–243. Springer, Heidelberg (2014)
12. Hakvoort, G., Poel, M., Gürkök, H.: Evaluating user experience with respect to user expectations in brain-computer interface games, pp. 1–4 (2011)
13. Fredrickson, B.L., Losada, M.F.: Positive affect and the complex dynamics of human flourishing. Am. Psychol. 60(7), 678–686 (2005)


14. Rodríguez, J.-A.P., Linares, V.R., González, A.E.M., Guadalupe, L.A.O.: Emociones negativas y su impacto en la salud mental y física. Suma Psicológica 16, 85–112 (2009)
15. Mosquera, G., Daniel, S.: Adquisición de señales electroencefalográficas para el movimiento de un prototipo de silla de ruedas en un sistema BCI. Ph.D. thesis (2012)
16. Harmon-Jones, E., Gable, P.A., Peterson, C.K.: The role of asymmetric frontal cortical activity in emotion-related phenomena: a review and update. Biol. Psychol. 84(3), 451–462 (2010)
17. Simón, V.: Mindfulness y neurobiología. Revista de Psicoterapia 66, 5–30 (2007)
18. Winkler, I., Jäger, M., Mihajlović, V., Tsoneva, T.: Frontal EEG asymmetry based classification of emotional valence using common spatial patterns. World Acad. Sci. Eng. Technol. 45, 373–378 (2010)
19. Cicchino, A.N.B.: Técnicas de procesamiento de EEG para detección de eventos. Postgradofcm.edu.ar (2014)
20. Navarro, F.S., Pedro, J., Lapuente, R.: Amígdala, corteza prefrontal y especialización hemisférica en la experiencia y expresión emocional. Universidad de Murcia, Servicio de Publicaciones, Murcia (2004)
21. Kostyunina, M.B., Kulikov, M.A.: Frequency characteristics of EEG spectra in the emotions. Neurosci. Behav. Physiol. 26(4), 340–343 (1996)
22. Mikels, J.A., Fredrickson, B.L., Larkin, G.R., Lindberg, C.M., Maglio, S.J., Reuter-Lorenz, P.A.: Emotional category data on images from the International Affective Picture System. Behav. Res. Methods 37(4), 626–630 (2005)
23. Bradley, M.M., Lang, P.J.: The International Affective Picture System (IAPS) in the study of emotion and attention, pp. 29–46 (2007)
24. Bertron, A., Petry, M., Bruner, R., McManis, M., Zabaldo, D., Martinet, S., Cuthbert, S., Ray, D., Koller, K., Kolchakian, M., Hayden, S.: International Affective Picture System (IAPS): technical manual and affective ratings (1997)
25. Delorme, A., Makeig, S.: EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 134, 9–21 (2004)
26. Carrillo, I., Meza-Kubo, V., Morán, A.L., Galindo, G., García-Canseco, E.: Emotions identification to measure user experience using brain biometric signals. In: Zhou, J., Salvendy, G. (eds.) ITAP 2015. LNCS, vol. 9193, pp. 15–25. Springer, Heidelberg (2015)