Lecture Notes in Computer Science – Springer – in press 2014

Easy Perception Lab: Evolution, Brain and Virtual and Augmented Reality in Museum Environment

Sara Invitto¹, Italo Spada², Dario Turco³, Genuario Belmonte⁴

[email protected], [email protected], [email protected], [email protected]

Abstract. Within an educational route of the Naturalistic Museum MAUS, we configured various types of intervention and study related to new technologies and new scientific languages, depending on the objectives of learning and involvement. The aim of this work was to increase and enhance the usability of the MAUS Museum through an Augmented Reality app and Virtual Reality projections related to natural stimuli (Plankton 3D and Tarbosaurus 3D), through a web site for the storage and geo-referencing of exhibits, and through the analysis and validation of the stimuli on the basis of the new technologies and of the characteristics of the elements of interaction. Easy Perception Lab is a project based on Information Technology in which we evaluated the cortical activation produced by stimuli presented in 2D and 3D in the MAUS museum, on the basis of evolutionary and neuroaesthetic hypotheses. Keywords: Virtual Reality, Neuroaesthetics, Museum Learning, Evolution.

¹ Department of Biological and Environmental Sciences and Technologies, University of Salento, Italy
² CETMA Consortium – New Media Area Manager – Mesagne, Italy
³ Agilex S.R.L., Lecce, Italy
⁴ Department of Biological and Environmental Sciences and Technologies, University of Salento, Italy, and MAUS Museum of the Environment – Unisalento, Italy

1 Introduction

The use of new Information and Communication Technologies within science education is today linked to a series of interactive software applications that operate at a very high technological level, making them extremely compelling in their interaction. These applications are accessible to the learner and have expanded the possibilities for experimentation, including for categories of users, such as disabled people, who can experience situations in a 'protected' way. On this basis, the MAUS Museum (Museum of the Environment, University of Salento) has expressed the need to develop new



educational tools for interactive users. The project aims to build a prototype of a learning movie in virtual reality, in order to define new paradigms of intervention in education. The Easy Perception Lab prototype allows a large set of users (primary and secondary school students and university students) to test a new approach to training based on virtual and augmented reality, grounded in the theories of neuroarchaeology and embodied perception [1-2], and specifically linked to perceptual immersion in the environment of a prehistoric animal. According to these theories, immersive learning develops greater neural plasticity, memory, and emotional involvement. In the Museum, users can move through the environment of the prehistoric animal or through the plankton environment via 3D perception, and this can work as a game or as an enjoyable learning moment [3-6].

The work on perceptual sounds and the psychophysiological EEG experimentation were developed by the Laboratory of Human Anatomy and Neuroscience; the Virtual Reality components (Kinect, interaction) were developed by CETMA; the Augmented Reality applications were developed by Agilex srl. All the products were supervised by MAUS.

Within the educational route of the Naturalistic Museum (MAUS), we therefore configured various types of intervention and study related to new technologies and new scientific languages, depending on the objectives of learning and involvement. The idea of this work was to increase and enhance the usability of the MAUS through an Augmented Reality app and Virtual Reality projections related to natural stimuli (Plankton 3D and Tarbosaurus 3D, with moments of direct interaction), and through a web site for the storage and geo-referencing of paleontological pictures, images and cards. All analysis and validation of the stimuli were based on the new technologies and on the characteristics of the elements of interaction, according to neuroaesthetic theory [7-8].
Neuroaesthetics "is the study of the neural processes that underlie aesthetic behavior". More precisely, "there are forms of interaction with objects that can be called 'aesthetic'", and "the job of neuroaesthetics is to identify these aesthetic functions and to investigate their neurobiological causes" [1]. Researchers prominent in the field combine principles from perceptual psychology, evolutionary biology, the study of neurological deficits, and functional brain anatomy in order to address the evolutionary meaning of aesthetics, which may be the essence of perception and of learning in a structure like a museum. In addition, important recent developments in the brain and cognitive sciences offer new avenues for productive cooperation between archaeology and neuroscience. There is thus great prospect, in the archaeology of mind, for a systematic cross-disciplinary endeavor to map the common ground between archaeology and neuroscience, frame new questions, and bridge divergent analytical levels and scales of time. Neuroarchaeology [2] aims at constructing an analytical bridge between brain and culture by putting material culture, embodiment [9], time and long-term change at center stage in the study of mind.

In light of these theoretical paradigms, the aim of this work is to assess, in a Museum of Natural History, how the stimuli on display can elicit different levels of cortical activation depending on the phylogenetic characteristics of the stimulus, and whether, when the experience is 'contextualized' and learning is immersive (environmental learning and Virtual Reality learning), cortical activation [10-12] can be modulated in the direction of greater arousal. In this study, we propose an experimental protocol in which we administered, through the software E-Prime 2.0 Presentation, a tachistoscopic


presentation of objects, animals, plants, colored screens, fossils and pictures of planktonic elements during an EEG session. Some of these elements were shown in a crossmodal way: 2D or 3D vision with sounds (the sounds could be consonant or dissonant with the image shown). The experimental study, based on psychophysiological techniques such as Event Related Potentials (ERP) and the Galvanic Skin Reflex, provided an analysis of the cortical processing of information through ERP analysis, after an immersive visit to MAUS and, later, following a presentation of 2D and 3D stimuli/archaeological finds from the MAUS. We also evaluated how different levels of crossmodal perception (visual and auditory), consonant or dissonant with the stimulus presented at MAUS, elicited different levels of attentional involvement [13-14].
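As an illustration, the crossmodal design described above (a visual stimulus paired with a matching sound, a mismatching sound, or no sound) can be enumerated as a condition grid. The following sketch is a minimal illustration; the stimulus and sound labels are hypothetical, not the exact codes used in the E-Prime protocol:

```python
from itertools import product

# Hypothetical labels: each visual stimulus has one "matching" sound category.
images = {"Tarbosaurus3D": "dinosaur", "Plankton3D": "aquatic"}
sounds = ["dinosaur", "aquatic", None]

def consonance(image, sound):
    """Classify a vision-sound pairing as consonant, dissonant, or no-sound."""
    if sound is None:
        return "no-sound"
    return "consonant" if images[image] == sound else "dissonant"

# Full vision-by-sound grid: 2 images x 3 sound conditions = 6 cells
grid = {(img, snd): consonance(img, snd) for img, snd in product(images, sounds)}
for (img, snd), label in grid.items():
    print(f"{img} + {snd}: {label}")
```

This kind of explicit grid makes it easy to check that every consonant and dissonant pairing is presented, and to attach ERP averages to each cell later in the analysis.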

2 Method

We prepared a two-step experiment: the first with an ergonomic analysis of the stimulus, and the second with an immersive 3D presentation in Virtual Reality. Our sample was composed of 24 university students, 12 men and 12 women (mean age 34 years, s.d. 11). The volunteers recruited had normal hearing, normal or corrected-to-normal vision, and right manual dominance. Recruited subjects had no previous experience of the MAUS Museum; 12 subjects were assigned to the Baseline Condition (2D vision) and 12 to the Immersive Environment Condition (3D, real experience). None of them had taken part in previous experiments. All participants gave written informed consent according to the Helsinki declaration. The ethics committee (ASL Lecce) authorized the research.

2.1 Behavioral Tools

Anagraphic data of the subject (age, sex); a 1-10 VAS (Visual Analogue Scale) for the discrimination of familiarity with the pictures; reaction time (recorded through E-Prime 2.0 Presentation).

2.2 Psychophysiological Tools

During the image presentation task we recorded a 16-channel EEG with the Brain AMP - Brain Vision Recorder, and the Galvanic Skin Reflex (GSR). In the EEG tracks we considered the ERPs (Event Related Potentials), with waves averaged for Reptiles, MAUS Objects, Planktonic Elements and Colours, Dinosaurus 3D and Plankton 3D.

2.3 Conditions of Registration

The recordings were made using an international standard montage of 16 electrodes/channels: EEG activity was recorded from the channels Fz, Cz, Pz, Fp1, Fp2, F3, F4, F7, F8, C3, C4, T7, T8, O1, O2, together with GSR, using the Brain Amp device


with the software Brain Vision Recorder (Brain Products GmbH © 2010), during the course of a go/no-go task. The electrode for the electro-oculogram (EOG) was applied over the left eye. The band-pass was 0.2 to 50 Hz; the EEG was sampled at 256 Hz in epochs of 1000 ms, with a 100 ms pre-stimulus baseline. Finally, all data were analyzed and filtered with the Brain Vision Analyzer software (Brain Products GmbH © 2010).
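The recording parameters above (0.2-50 Hz band-pass, 256 Hz sampling, 1000 ms epochs with a 100 ms pre-stimulus baseline) correspond to a standard offline ERP preprocessing pipeline. The following is a minimal sketch of that pipeline with SciPy on synthetic data; the channel count is taken from the montage above, but the signal, filter order, and event onsets are illustrative assumptions, not the study's actual data:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256            # sampling rate (Hz), as in the recording setup
EPOCH_MS = 1000     # post-stimulus window (ms)
BASELINE_MS = 100   # pre-stimulus baseline (ms)

# Synthetic continuous EEG: 16 channels, 60 s of noise (stand-in for real data)
rng = np.random.default_rng(0)
eeg = rng.normal(size=(16, 60 * FS))

# 0.2-50 Hz band-pass, zero-phase (4th-order Butterworth, an assumed order)
b, a = butter(4, [0.2, 50.0], btype="bandpass", fs=FS)
filtered = filtfilt(b, a, eeg, axis=1)

def epoch(data, onsets, fs=FS):
    """Cut baseline-corrected epochs around stimulus onsets (in samples)."""
    pre = int(BASELINE_MS / 1000 * fs)
    post = int(EPOCH_MS / 1000 * fs)
    epochs = []
    for t in onsets:
        seg = data[:, t - pre : t + post]
        # subtract each channel's mean over the pre-stimulus baseline
        seg = seg - seg[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(seg)
    return np.stack(epochs)  # shape: (n_epochs, n_channels, pre + post)

onsets = np.arange(2 * FS, 58 * FS, 3 * FS)  # hypothetical onsets every 3 s
epochs = epoch(filtered, onsets)
print(epochs.shape)
```

Averaging `epochs` over the first axis, separately per stimulus category, would then yield the ERPs analyzed in the following sections.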

2.4 First Experiment

The images were selected from a repertoire of neutral images (colored squares on a light background) and target images (fish, reptiles, mammals, MAUS objects and normal objects, plankton and Tarbosaurus). All stimuli had a size of 240 x 210 pixels and were displayed centrally on a light gray background, at the same level of brightness, on the computer monitor. The oddball task was administered using the software E-Prime 2.0 (Psychology Software Tools, Inc.). The stimuli were 56 in total, including 10 targets called "Mammals", 10 called "Fish", 10 called "Reptiles", 10 called "Objects", and 16 colored screens. The experimental task consisted of 4 trials; each trial, consisting of only one type of target alternating randomly with the colored backgrounds, had a duration of 6000 ms, with a stimulus duration of 2000 ms and an interstimulus interval of 1000 ms. Participants were instructed to sit upright, with approximately 75 cm between the front edge of the chair and the computer screen, and to press the mouse button whenever they saw an image on the screen.
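The structure of one such oddball trial (a single target category randomly interleaved with colored-screen backgrounds) can be sketched as follows. This is an illustration, not the actual E-Prime script; the split of 10 targets and 4 colored screens per trial is inferred from the 56 stimuli divided over 4 trials, and the timing uses the 2000 ms stimulus and 1000 ms interstimulus values stated above:

```python
import random

STIM_MS = 2000   # stimulus duration, as in the protocol
ISI_MS = 1000    # interstimulus interval, as in the protocol

def build_trial(target_category, n_targets=10, n_colors=4, seed=None):
    """Randomly interleave one target category with colored-screen backgrounds."""
    rng = random.Random(seed)
    sequence = [target_category] * n_targets + ["ColorScreen"] * n_colors
    rng.shuffle(sequence)
    return sequence

trial = build_trial("Reptiles", seed=1)
duration_ms = len(trial) * (STIM_MS + ISI_MS)
print(trial)
print(f"trial duration: {duration_ms} ms")
```

Under these assumptions, each 14-stimulus trial lasts 42 s of presentation time, which is useful for checking the overall session length against the EEG recording.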


Fig. 1: MAUS image/object – Teleosaurus, an ancient large reptile of the Jurassic period (MAUS-Unisalento)

Fig. 2: Plankton, Acartia, in 3D


Fig. 3: Tarbosaurus in 3D

2.5 Data Analysis

The ANOVA on amplitude and latency was significant for parietal amplitude (F = 6.156, p < 0.001), parietal latency (F = 3.190, p = 0.039), and occipital amplitude (F = 7.069, p = 0.001). Post hoc comparisons (Bonferroni test) on the trials corresponding to the stimulus categories (Fish, Reptiles, Mammals, MAUS objects and normal objects) showed a significant difference in parietal amplitude for Reptiles versus Fish, in parietal latency for Reptiles versus Fish, and in occipital amplitude for Reptiles versus Mammals. The ANOVA on amplitude and latency by lateralization was significant for left parietal amplitude (F = 3.981, p = 0.035), right parietal amplitude (F = 3.976, p = 0.035) and right occipital amplitude (F = 4.924, p = 0.019). Post hoc comparisons (Bonferroni test) on the trials corresponding to the categories presented showed a significant amplitude in the left parietal region for the Reptiles category, and in the right parietal and right occipital regions, again for the Reptiles category.

2.6 Second Experiment

The objective of our study was to evaluate the perceptual characteristics of museum objects with evolutionary contents and natural sounds in a university museum, and how these characteristics can be fully elicited within an immersive virtual reality environment. The assignment was a go/no-go task, carried out using a visual and auditory crossmodal process. The visual stimuli consisted of two types of 2D images (colored screens and objects present in the museum, Fig. 1) and two types of 3D anaglyph images (plankton, Fig. 2, and Tarbosaurus, Fig. 3), which were presented with the sounds of the Tarbosaurus or aquatic sounds in the background, or with no sound at all. The experimental group performed the task in the museum (13 volunteers, recruited in the university; avg. age 29.2; SD = 4.2). The control group performed the task in the university (12 volunteers; avg. age 34; SD = 6).

2.7 Data Analysis

N200: we found no significant differences in the N200 component in the baseline condition. In the comparison between the baseline condition and the environmental context we found a significant effect on Fp2 amplitude (p = 0.36) and F4 amplitude (p = 0.48), in the direction of an increase in negative amplitude in the Museum condition in all the trials. P300: we found significant differences in F4 latency (p = 0.025), O1 amplitude (p = 0.027), O2 amplitude (p = 0.006), F7 latency (p = 0.022), and Pz latency (p = 0.007) and amplitude (p = 0.011).
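The ANOVAs reported in these sections compare ERP amplitude and latency across stimulus categories. A minimal one-way ANOVA of this kind can be run with SciPy; the amplitude values below are simulated for illustration (the group sizes and the higher mean for Reptiles are assumptions mimicking a category effect, not the study's data):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)

# Simulated parietal ERP amplitudes (microvolts) per stimulus category;
# Reptiles are given a higher mean to mimic a category effect.
fish     = rng.normal(4.0, 1.0, 12)
reptiles = rng.normal(6.0, 1.0, 12)
mammals  = rng.normal(4.2, 1.0, 12)
objects  = rng.normal(4.1, 1.0, 12)

# One-way ANOVA across the four categories
f_stat, p_value = f_oneway(fish, reptiles, mammals, objects)
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
```

A significant omnibus F would then be followed by pairwise post hoc comparisons with a Bonferroni correction, as done in the study.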

Analysis of MAUS objects in the baseline and museum conditions: in the N200 ERP we found a significant effect on O1 amplitude (p = 0.048) and P8 amplitude (p = 0.040), in the direction of an increase in positive amplitude in the Museum condition and in the 3D condition (Plankton and Tarbosaurus). In the P300 ERP we found a significant effect on F4 amplitude (p = 0.048), Cz amplitude (p < 0.001) and Pz latency (p = 0.025). The GLM considered group, target condition and sound condition, and showed a significant effect in the ERP (Group α