
Tactile Information Flow Ontology

Eirini V. Myrgioti, Vasileios G. Chouvardas, Amalia N. Miliou, and Miltiadis K. Hatalis
Department of Informatics, Aristotle University of Thessaloniki, 541 24, Thessaloniki, Greece
e-mail: {emyrgiot, vchou, amiliou, mkh}@csd.auth.gr

Abstract - Tactile displays are interfaces that enable users to access information through the human skin using the sense of tactation. For software to effectively support the generation, flow and perception of tactile information, and to drive tactile interfaces, a formal definition of tactile information is needed. In this article, we propose a model of tactile information flow using ontologies. The proposed model organizes the existing knowledge about tactile information and helps define the types of information that take part in the tactile information flow. The new model can be used to identify how tactile displays can enhance the perception of information and human-computer interaction, and to assist the development of software for tactile displays.

Index Terms - Ontologies, software development, tactile displays, tactile information, UML.

I. INTRODUCTION

A tactile display is a human-computer interface that utilizes tactation to present information. Tactile information is created by a tactile display as a set of signals that are conveyed to the skin receptors through the skin. The receptors transmit tactile signals through the nervous system to the cerebral cortex of the brain in order to form a perception [1]. Software development for tactile displays is not a trivial task, because developers must have thorough knowledge of skin and nervous system physiology. For software to effectively drive tactile displays, a well-structured representation of tactile information is required. However, different features of a touched object are perceived through various combinations in the skin-nerves-brain scheme. Since tactile information is modulated by this channel, we must first model its flow. Although there is considerable research on tactile sensation [2], [3], [4] and tactile information processing [5], [6], [7], a formal definition of tactile information has not yet been developed. This may be because tactile information flow is a complex phenomenon involving many interrelated elements, such as different kinds of skin receptors and several brain areas. The relations between those elements and their properties describe the generation and processing of tactile information. Ontologies play a critical role in the software development process, since they define the basic terms and relations comprising the vocabulary of a topic area, as well as the rules for combining terms and relations to define extensions to the vocabulary [8], [9].

The aim of this paper is to provide a formal definition and model of tactile information flow using ontologies. This definition will be used to construct a tactile information model for tactile display software development.

The paper is organized as follows: Section two presents the methodology, design considerations and modeling concepts for building the tactile information flow ontology. The elements that take part in the tactile information flow are discussed in section three. In section four the UML diagram of the ontology is described. Finally, in section five, conclusions and future research directions are discussed.

ICCTA 2007, 1-3 September 2007, Alexandria, Egypt

II. METHODOLOGY

Tactile interaction is related to all aspects of touch; it involves not only sensation by the skin but also perception by the brain [2]. There are different kinds of receptors in human skin related to touch, limb and joint movement, and temperature sensing [3], [10]. Tactile perception results from combining inputs from all receptors in a given skin area, as all skin receptors are stimulated simultaneously [2]. The final processing of the tactile signals is done by the brain, specifically in the cerebral cortex, as reported in [5], [6], [7]. The ontology developed in our work includes basic concepts of the tactile information processing domain and the relations among them. These concepts are defined and


interpreted in a declarative way [8]. The main advantage of the proposed model is that it is reusable across different applications and assists the development of software for tactile displays. The specification of the tactile information flow ontology follows the steps of tactile information generation and processing. The information produced by a tactile device is conveyed to the skin receptors through the skin and is then transmitted to the brain through the nerves, where a perception is formed. In the development of the tactile information flow ontology we consider the skin physiology, the different kinds of skin receptors and their properties. Finally, the tactile information flow ontology takes into account the areas of the brain [5] that are responsible for tactile information processing and the relations between those brain areas and the skin receptors.

III. TACTILE INFORMATION FLOW

The representation of tactile information in terms of "tactile display-skin-nerves-brain" sets the foundations for analyzing tactile information. In the following subsections, we describe the main classes of the tactile information flow ontology: skin, receptors, nerves and brain.

A. Skin

The skin is considered as a layer that transmits energy to the receptors; we therefore consider its mechanical, thermal and electrical characteristics. Its properties vary among different human groups (blind people have thinner skin at the fingertips) and among parts of the body. In our case we consider the properties of glabrous skin, specifically at the fingertip.

TABLE I
HUMAN MECHANORECEPTORS AND THEIR CHARACTERISTICS

|                                      | Meissner corpuscles    | Merkel cells                     | Ruffini endings | Pacinian corpuscles    |
| Stimulation type                     | Stroking, fluttering   | Pressure                         | Skin stretch    | Vibration              |
| Frequency range (most sensitive at)  | 10-200 Hz (200-300 Hz) | 0.4-100 Hz (7 Hz)                | 7 Hz            | 40-800 Hz (200-300 Hz) |
| Spatial resolution                   | 2-8 mm                 | 0.5 mm                           | 1 cm            | 2 cm                   |
| Receptive field                      | Small                  | Small                            | Large           | Large                  |
| Adaptation                           | Rapid (RA I)           | Slow (SA I)                      | Slow (SA II)    | Rapid (RA II)          |
| Receptors/cm2 (at the fingertip)     | 140                    | 70                               | 9               | 21                     |
| Physical features to be sensed       | Motion direction       | Shape, edges, curvature, texture | Skin stretch    | Texture, roughness     |

Time range: 5 ms to perceive separate stimuli; 20 ms to perceive stimulus order.
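The characteristics summarized in Table I lend themselves to a machine-readable form. The following is a minimal sketch, assuming Python and field names of our own choosing (not from the paper), that encodes the table as a lookup structure and answers a simple query: which receptor types respond at a given stimulation frequency.

```python
# Illustrative encoding of the mechanoreceptor characteristics of Table I.
# Field names are our own; values follow the table (frequency ranges in Hz,
# densities in receptors per cm^2 at the fingertip).
MECHANORECEPTORS = {
    "Meissner": {"stimulus": "stroking, fluttering", "freq_hz": (10, 200),
                 "adaptation": "RA I", "field": "small", "density_per_cm2": 140},
    "Merkel":   {"stimulus": "pressure", "freq_hz": (0.4, 100),
                 "adaptation": "SA I", "field": "small", "density_per_cm2": 70},
    "Ruffini":  {"stimulus": "skin stretch", "freq_hz": (7, 7),
                 "adaptation": "SA II", "field": "large", "density_per_cm2": 9},
    "Pacinian": {"stimulus": "vibration", "freq_hz": (40, 800),
                 "adaptation": "RA II", "field": "large", "density_per_cm2": 21},
}

def receptors_for_frequency(f_hz):
    """Return the receptor types whose frequency range covers f_hz."""
    return sorted(name for name, props in MECHANORECEPTORS.items()
                  if props["freq_hz"][0] <= f_hz <= props["freq_hz"][1])
```

A tactile display driver could use such a query, for example, to choose an actuation frequency that targets a particular receptor population.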

B. Receptors

The skin contains different kinds of receptors, arranged at different depths. The receptors related to tactile information are organized in four classes: mechanoreceptors, proprioceptors, thermoreceptors and nociceptors.

Mechanoreceptors are located in different layers of the skin and are classified into four categories, rapidly adapting I and II (RA I and RA II) and slowly adapting I and II (SA I and SA II), according to their adaptation speed [3], [4], [11]. Their characteristics are summarized in Table I [3], [4], [10], [11], [12].

Proprioceptors are found in or near joints (Golgi tendon organs and Ruffini endings). Ruffini endings show some activation during both static positioning and limb movement. Golgi tendon organs sense active and static limb positioning [2]. Muscle spindles signal changes in the length of a muscle, which are related to changes in the angles of the joints that the muscle crosses [13].

Thermoreceptors sense changes in skin temperature. They contribute (a) to the total sensory information about an object making contact with the skin, and (b) to assessing the body heat loss or gain that occurs over a large part of the body surface [14]. Humans recognize four distinct types of thermal sensations (cold, cool, warm and hot) that result from temperature differences between the skin (normally 34 °C) and the objects contacting the body. Although the skin appears to be sensitive to temperature, our receptors cannot measure the exact temperature of a surface; rather, they sense the flow of thermal energy. Cold receptors respond to steady-state temperatures of 5-40 °C (most active at 25 °C), while warm receptors respond to temperatures of 29-45 °C (most active at 45 °C) [12]. At temperatures over 45 °C or below 15 °C, pain receptors


(nociceptors) are stimulated and humans feel pain rather than a hot or cold sensation, respectively [14].

Nociceptors signal stimuli that can damage the skin tissue. They are classified as mechanical, excited by sharp objects that penetrate, squeeze, or pinch the skin; thermal, which respond to extremes of temperature as well as strong mechanical stimuli; and polymodal, which respond to a variety of destructive mechanical, thermal, and chemical stimuli [12].

C. Nerves

Sensory information is conveyed in separate pathways to the brain thalamus and cerebral cortex by populations of different types of sensory neurons [15].

D. Brain

The somatosensory system is able to integrate spatial, temporal, intensive, kinaesthetic and other information about tactile stimuli to create the perception of an object [16]. Sensory information is processed in a series of relay regions within the brain. These regions contain maps of the parts of the body and provide information about different aspects of sensation. The somatic sensory cortex has three major divisions: the primary (S-I) and secondary (S-II) somatosensory cortices and the posterior parietal cortex. Each of them contains areas that are responsible for processing different signals transmitted by the skin. The primary somatic cortex (S-I) [5] contains four areas, commonly referred to as area 1, area 2, area 3a, and area 3b. Area 1 receives input from RA receptors and senses an object's size. Area 2 contains a map of mechanoreceptors and is responsible for sensing the size and shape of objects as well as more complex features such as direction of motion across the hand, curvature of surfaces and orientation of edges [6]. Area 3a receives input primarily from muscle and joint stretch receptors, while area 3b receives input from cutaneous receptors. Area 3b is involved in sensing surface texture. The secondary somatic sensory cortex (S-II) is innervated by neurons from each of the four areas of S-I.
Other important somatosensory cortical areas are located in the posterior parietal cortex and receive outputs from S-I [5]. Area 5 integrates tactile signals from mechanoreceptors in the skin with proprioceptive inputs from the underlying muscles and joints, and encodes the shape of an object. Area 7 receives visual as well as tactile and proprioceptive input. Neurons in these areas (and neurons in area 2) are involved in the later stages of


somatosensory processing, detecting more complex features of an object and having larger receptive fields than first-order cortical neurons. The function of the three regions depends on the properties of cortical neurons [5]. The receptive fields of cortical neurons become progressively more complex with each stage of information processing, as more features of a stimulus are extracted at each stage. Cortical neurons are also defined by their sensory modality. Furthermore, the sensitivity of cortical neurons depends on the stimulus features and on the response time (for example, in areas 3a and 3b the response occurs about 20 ms after touch).
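To illustrate how the physiological knowledge in section three can feed tactile display software, the thermoreceptor behaviour described above can be sketched as a simple classifier. Only the 5-40 °C and 29-45 °C receptor ranges and the 15 °C/45 °C pain limits come from the text; the boundaries chosen between the four sensation categories are our own simplifying assumptions around the neutral skin temperature of 34 °C.

```python
def thermal_sensation(surface_temp_c):
    """Rough sensation category for an object surface touching the skin.

    Thresholds from Section III.B: pain below 15 C or above 45 C; cold
    receptors 5-40 C; warm receptors 29-45 C. Category boundaries between
    cold/cool/warm/hot are illustrative assumptions, not from the paper.
    """
    if surface_temp_c < 15 or surface_temp_c > 45:
        return "pain"          # nociceptors dominate
    if surface_temp_c < 29:
        return "cold"          # only cold receptors respond
    if surface_temp_c <= 34:
        return "cool"          # near the neutral skin temperature (34 C)
    if surface_temp_c <= 40:
        return "warm"          # warm receptors increasingly active
    return "hot"               # approaching the pain threshold
```

Such a function hints at how a thermal tactile display could be kept inside the comfortable 15-45 °C band while still spanning all four sensation types.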

IV. ONTOLOGY DESIGN

An ontology can be expressed in different formalisms, such as the Knowledge Interchange Format (KIF) [17], the Web Ontology Language (OWL) and the Unified Modeling Language (UML) [18]. UML is a formalism that is generally accepted for object-oriented design. Using UML, an ontology is represented as a static model consisting of a class diagram that depicts the domain concepts, their properties and their relationships [19], [20]. UML tools provide straightforward generation of software code (Java or C++) from UML diagrams. Our model (Fig. 1) addresses the different skin receptors, the nerves, the brain areas and the properties through which they contribute to tactile information processing. Specifically, the four main classes of the ontology are skin, receptors, nerves and brain. Class "Receptors" has four subclasses that represent the existing receptor kinds: "Mechanoreceptors", "Proprioceptors", "Thermoreceptors" and "Nociceptors". Each of these subclasses contains several properties. Class "Brain" has three subclasses: "Primary somatic cortex", "Secondary somatic cortex" and "Posterior parietal cortex". Generalization, the relation between classes and subclasses, is depicted by an arrow. Association, the relation between two classes such as skin and nerves, is indicated by a line in the diagram [19].

V. CONCLUSIONS

In the previous sections we have described the construction of an ontology concerning the elements that take part in the tactile information flow, based on the generation, transmission and processing of tactile signals in the skin, the nerves and the brain. This representation provides a well-structured set of terms and relations that


Fig. 1. UML diagram of the tactile information flow ontology

can be used for defining tactile information classes and assisting software development. For the description of the ontology we have used UML because it provides a formal way of modeling classes and their relations, and also supports direct code generation. This ontology model explicitly describes the following important concepts: the generation of the tactile signal at the skin receptors, the transmission of this signal through the nerves, and the perception of the received information by the brain. The representation of these concepts will help in better


understanding of tactile sensation, because it defines the possible ways of sensing through touch and of perceiving the information created by a tactile display. Further research will be directed to the development of a tactile information ontology based on the present work, which will lead to the construction of classes (in an OO language) that formally represent information about the tangible features of objects (size, shape, texture, etc.). This class hierarchy will provide the tools for tactile display software development.
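As a rough illustration of what such classes might look like, the following sketch mirrors the class diagram of Fig. 1 — generalization rendered as inheritance and association as a method call — in the style of a UML tool's generated skeleton. All attribute and method names here are our own assumptions, since the paper does not list them.

```python
# Hypothetical skeleton for the Fig. 1 ontology; attribute/method names are
# illustrative assumptions, not taken from the paper.

class Skin:
    """Energy-transmitting layer; properties vary per body part."""
    def __init__(self, region="fingertip"):
        self.region = region            # glabrous skin at the fingertip

class Receptors:
    """Base class; generalization arrow in UML = inheritance here."""
    def __init__(self, depth_layer=None, receptive_field=None):
        self.depth_layer = depth_layer
        self.receptive_field = receptive_field

class Mechanoreceptors(Receptors): pass
class Proprioceptors(Receptors): pass
class Thermoreceptors(Receptors): pass
class Nociceptors(Receptors): pass

class Nerves:
    """Association between skin/receptors and brain: signal transmission."""
    def transmit(self, signal):
        return signal                   # placeholder for the neural pathway

class Brain: pass
class PrimarySomaticCortex(Brain): pass
class SecondarySomaticCortex(Brain): pass
class PosteriorParietalCortex(Brain): pass
```

The point of the sketch is the structure, not the bodies: once the ontology fixes the classes and relations, a UML tool can emit exactly this kind of skeleton in Java or C++ for a tactile display driver to fill in.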


ACKNOWLEDGEMENT

The authors would like to acknowledge the financial support of this work by the Greek General Secretariat for Research and Technology, PENED 2003/5505, "Innovative man-machine interfaces: haptic displays/arrays and flexible displays".

REFERENCES

[1] V. G. Chouvardas, A. N. Miliou, and M. K. Hatalis, "Tactile display applications: A state of the art survey," Proceedings of the 2nd Balkan Conference in Informatics, pp. 290-303, November 2005.
[2] K. S. Hale and K. M. Stanney, "Deriving haptic design guidelines from human physiological, psychophysical, and neurological foundations," IEEE Computer Graphics and Applications, vol. 24, no. 2, pp. 33-39, March/April 2004.
[3] V. G. Macefield, "Physiological characteristics of low-threshold mechanoreceptors in joints, muscle and skin in human subjects," Clinical and Experimental Pharmacology and Physiology, vol. 32, pp. 135-144, 2005.
[4] E. Pasterkamp, "Mechanoreceptors in the glabrous skin of the human hand," Archives of Physiology and Biochemistry, vol. 107, pp. 338-341, 1999.
[5] E. P. Gardner and E. R. Kandel, "Touch," in Principles of Neural Science, 4th ed., E. R. Kandel, J. H. Schwartz, and T. M. Jessell, Eds. Elsevier, 2000, pp. 452-472.
[6] S. Bohlhalter, C. Fretz, and B. Weder, "Hierarchical versus parallel processing in tactile object recognition," Brain, vol. 125, pp. 2537-2548, 2002.
[7] Y. Hlushchuk and R. Hari, "Transient suppression of ipsilateral primary somatosensory cortex during tactile finger stimulation," The Journal of Neuroscience, vol. 26, no. 21, pp. 5819-5824, May 2006.
[8] V. Devedzic, "Understanding ontological engineering," Communications of the ACM, vol. 45, no. 4, pp. 136-144, 2002.
[9] A. Gomez-Perez, "Ontological engineering: A state of the art," Expert Update, pp. 33-42, 2003.
[10] R. S. Johansson, "Tactile sensibility in the human hand: Receptive field characteristics of mechanoreceptive units in the glabrous skin area," Journal of Physiology, vol. 281, pp. 101-123, 1978.
[11] S. J. Bolanowski, G. A. Gescheider, R. T. Verrillo, and C. M. Checkosky, "Four channels mediate the mechanical aspects of touch," Journal of the Acoustical Society of America, vol. 84, no. 5, pp. 1680-1694, November 1988.
[12] E. P. Gardner, J. H. Martin, and T. M. Jessell, "The bodily senses," in Principles of Neural Science, 4th ed., E. R. Kandel, J. H. Schwartz, and T. M. Jessell, Eds. Elsevier, 2000, pp. 432-451.
[13] K. Pearson and J. Gordon, "Spinal reflexes," in Principles of Neural Science, 4th ed., E. R. Kandel, J. H. Schwartz, and T. M. Jessell, Eds. Elsevier, 2000, pp. 714-737.
[14] I. Darian-Smith and K. O. Johnson, "Thermal sensibility and thermoreceptors," The Journal of Investigative Dermatology, vol. 69, pp. 146-153, 1977.
[15] O. Guiba-Tziampiri, Human Physiology. Zygos Editions, 2004.
[16] G. H. Recanzone, M. M. Merzenich, and C. E. Schreiner, "Changes in the distributed temporal response properties of SI cortical neurons reflect improvements in performance on a temporally based tactile discrimination task," Journal of Neurophysiology, vol. 67, no. 5, pp. 1071-1091, 1992.
[17] T. R. Gruber, "Toward principles for the design of ontologies used for knowledge sharing," International Journal of Human-Computer Studies, vol. 43, pp. 907-928, 1995.
[18] P. A. Kogut, S. Cranefield, L. Hart, M. Dutra, K. Baclawski, M. M. Kokar, and J. E. Smith, "UML for ontology development," The Knowledge Engineering Review, vol. 17, no. 1, pp. 61-64, 2002.
[19] S. Cranefield and M. Purvis, "UML as an ontology modelling language," Proceedings of the Workshop on Intelligent Information Integration, 16th International Joint Conference on AI, 1999.
[20] N. F. Noy and D. L. McGuinness, "Ontology Development 101: A guide to creating your first ontology," Knowledge Systems Laboratory, Stanford University, 2001. http://www.ksl.stanford.edu/people/dlm/papers/ontology101/ontology101-noy-mcguinness.html

