Personal Context Extractor with Multiple Sensors on a Cell Phone

Toshiki Iso, Norihiro Kawasaki and Shoji Kurakake
Network Laboratories, NTT DoCoMo, Inc.
3-5 Hikari-no-oka, Yokosuka, Kanagawa, 239-8536 Japan
Email: [email protected]

Abstract— Demand is growing for context-aware mobile services such as presence services. In response, we introduce "PerContEx", a system that mounts multiple sensors on a cell phone to extract personal context. Compared to related work, its benefit is that the user does not need to wear any sensors on his body: all the sensors are mounted on the cell phone, and the algorithm that extracts user context is robust with regard to sensor position. We introduce the architecture of PerContEx in terms of its hardware and software.

keywords - context, sensors, cell phone, presence service, acceleration sensor, mobile
I. INTRODUCTION

Cell phones now offer richer functions because of smaller and more sophisticated sensors; they are no longer only communication tools but also IC-card-based wallets (FeliCa Keitai [1], NTT DoCoMo, Japan), and are therefore becoming an essential part of daily life in Japan. Services based on user context, such as presence services [2] and service navigation [3], have recently been launched. D. Siewiorek et al. proposed SenSay [4], a presence service system [5], [6], [7], [8] based on a handheld device and many sensors. Their system assumes that the user wears many sensors at predetermined spots on his body. Unfortunately, this is not really practical, because users object to having to wear these sensors continuously. Our solution is PerContEx (Personal Context Extractor), a system that extracts user context without forcing the user to wear any sensors. It is based on the idea of mounting the sensors on the cell phone, and it uses an algorithm that is robust with regard to sensor position. The algorithm can detect a specific signal pattern without requiring the sensors to be worn at predetermined spots. This is because we can derive eigenvalue equations by solving an optimization problem on the similarity between the observed data and the ideal data converted by a coordinate transformation. Therefore, in order to detect a specific pattern, all that is basically needed is to solve the eigenvalue equations, instead of searching for optimal solutions over all sensor positions. The simplicity of this approach is a strong merit in achieving real-time processing. Section 2 explains the concept of PerContEx, and Section 3 details the system's algorithm for detecting a specific signal pattern. We then show some experimental results. Finally, we draw several conclusions and describe future work.
II. PERCONTEX

A. System Architecture

PerContEx is composed of multiple sensors mounted on a cell phone, a data transmitter, and a context analysis server (Fig. 1). The sensors collect physical, environmental, and biological information as described below. The data transmitter sends this data to the context server in real time. The context server extracts the user's context from the collected data and sends a message describing the context to one or more services, subject to the user's security policy.

B. Sensors

Most recent cell phones in Japan are the folding type, so the sensors are mounted as follows (Figs. 2 and 3):

- 3-axis acceleration sensors are mounted on the bottom and top of the cell phone.
- A 3-axis angular rate (gyro) sensor is mounted on the bottom of the cell phone.
- Two microphones are mounted on the top edge of the cell phone.
- An infrared sensor and an illumination sensor are mounted on the top edge of the cell phone.
- Temperature sensors are mounted on the bottom and top of the cell phone.
- Pressure sensors are mounted on both sides and the bottom of the cell phone.
- A skin resistance sensor is mounted on the bottom of the cell phone.
- A sphygmograph sensor is mounted on one side of the cell phone.
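To make the data flow from the phone to the context analysis server concrete, the following is a minimal Python sketch of how one time-stamped sensor sample might be packaged for transmission. All field names, units, the JSON transport, and the send_to_context_server stub are illustrative assumptions, not part of the actual PerContEx implementation.

```python
# Illustrative only: field names and transport are assumptions,
# not the actual PerContEx wire format.
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class SensorSample:
    """One time-stamped reading from the phone-mounted sensors."""
    timestamp: float = field(default_factory=time.time)
    accel_top: tuple = (0.0, 0.0, 0.0)      # 3-axis acceleration, top unit [m/s^2]
    accel_bottom: tuple = (0.0, 0.0, 0.0)   # 3-axis acceleration, bottom unit [m/s^2]
    gyro: tuple = (0.0, 0.0, 0.0)           # 3-axis angular rate [rad/s]
    illumination: float = 0.0               # illuminance [lx]
    infrared: float = 0.0                   # infrared sensor output
    temperature: tuple = (0.0, 0.0)         # bottom / top [deg C]
    pressure: tuple = (0.0, 0.0, 0.0)       # side / side / bottom grip pressure
    skin_resistance: float = 0.0            # [ohm]
    sphygmograph: float = 0.0               # pulse-wave amplitude

def send_to_context_server(sample: SensorSample) -> bytes:
    """Serialize a sample as the data transmitter might before sending it
    over the mobile network to the context analysis server."""
    return json.dumps(asdict(sample)).encode("utf-8")

if __name__ == "__main__":
    print(send_to_context_server(SensorSample(illumination=320.0)))
```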
Fig. 2. Multiple sensors mounted on a cell phone (1): illuminance sensor, directional microphone, infrared sensor, skin resistance sensor, pressure sensors, and sphygmograph sensor.
Fig. 1. PerContEx (Personal Context Extractor) system. On the cell phone side: sensory data detection and a sensory data transmitter (user X has a cell phone with multiple sensors; user context can be extracted without any sensors on the user's body, both when using mobile services such as phone, E-mail and the Internet, and when moving). On the network side: a user context management server with a sensory data receiver, sensory data analysis, specific signal data extraction, user context extraction, a specific signal data to user context DB, a user context to presence information DB, presence information conversion (including security management), and a presence information transmitter that delivers presence information to receivers (e.g., user A) over the mobile network.
Fig. 3. Multiple sensors mounted on a cell phone (2): acceleration sensors, skin resistance sensor, tilt sensor (compass), temperature sensors, and pressure sensor.

C. User Context Identified

1) Physical User Context: Data from the two 3-axis acceleration sensors and the 3-axis angular rate sensor indicate the user's physical action state, such as running, walking, or sitting. Details of the algorithm are described in the next section. The three pressure sensors can detect the user's key-typing patterns, so we can identify which service the user is accessing, such as E-mail, Internet browsing, or taking a photograph. The illumination data indicates whether the cell phone is being held in the hand or placed in a bag or pocket.

2) Environmental User Context: Based on the two microphones, the acceleration sensors, and the infrared sensor, we can detect whether the user is surrounded by people, since the infrared sensor can detect the presence of people.

3) Biological User Context: Recent research [9] indicates that it is possible to detect human emotion at the central nerve level from sphygmograph data by chaos analysis. The output of the sphygmograph sensor can therefore be processed to determine whether the user is concentrating or excited.
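As an illustration of the simple mappings described above, the sketch below turns a few sensor readings into coarse context labels. The thresholds, function names, and labels are invented for illustration only; the actual PerContEx decision logic (and the chaos analysis of [9]) is not specified at this level of detail in the paper.

```python
# Toy context rules; thresholds and labels are illustrative assumptions.
def holding_context(illumination_lx: float) -> str:
    """Bright readings suggest the phone is held in the hand,
    dark readings that it is in a bag or pocket."""
    return "in_hand" if illumination_lx > 50.0 else "in_bag_or_pocket"

def environmental_context(infrared_level: float) -> str:
    """The infrared sensor responds to people near the user."""
    return "people_nearby" if infrared_level > 0.5 else "alone"

def service_context(key_pressure_events: int) -> str:
    """Bursts of pressure-sensor events indicate key typing,
    e.g. composing E-mail or browsing."""
    return "typing_on_keypad" if key_pressure_events > 3 else "idle"

if __name__ == "__main__":
    print(holding_context(5.0), environmental_context(0.8), service_context(10))
```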
III. ALGORITHM FOR DETECTING SPECIFIC SIGNAL PATTERNS: ROBUST TO SENSOR POSITION

The proposed algorithm can detect specific signal patterns independent of sensor position. This eliminates the need for the user to wear the sensors. The algorithm is intended for sensors that have two or more axes; the 3-axis acceleration sensor is taken as an example to explain it. Fig. 4 shows a model of the 3-axis acceleration sensor. We define O-XYZ as the absolute coordinate system and o-xyz as the coordinate system on the acceleration sensor. We denote a^abs(t) and a^obs(t) as the acceleration data at time t on the absolute coordinate system and on the sensor coordinate system, respectively. As the acceleration sensor detects only translational motion, by using noise n(t) and rotation matrix R(t), we can represent the relation between the acceleration data on the absolute coordinate system and that on the sensor coordinate system as below:

  a^obs(t) = R(t) a^abs(t) + n(t)    (1)

where

  R(t) = Rx(α(t)) Ry(β(t)) Rz(γ(t))    (2)

  Rz(γ) = [[cos γ, −sin γ, 0], [sin γ, cos γ, 0], [0, 0, 1]]    (3)

and Rx(α), Ry(β), and Rz(γ) are the rotation matrices about the x axis, y axis, and z axis, respectively. This leads to the following formula:

  R(t) R(t)^T = I    (4)

Fig. 4. Two types of coordinate systems: the absolute coordinate system O-XYZ and the coordinate system o-xyz on the acceleration sensor, related by a rotating transform with angles α(t), β(t), γ(t); the acceleration sensors detect only translational motion.

Next, we assume that specific signal patterns are generated over finite periods of time [t − Δt, t + Δt]. We also assume that no rotational movement occurs during [t − Δt, t + Δt]; that is, for example, the user does not strongly twist the cell phone in his hand while walking or running. Thus, we can replace the rotation matrix R(t) with R, which is constant during [t − Δt, t + Δt].

Next, in order to detect feature vectors of specific signal patterns, we use a wavelet transform as an eigen-function expansion. This is because specific signal patterns are basically localized patterns in finite periods of time. The mother-wavelet functions depend on the user's activity. Here, a and b represent the scale and shift parameters, respectively. The following formula is yielded by the wavelet transform:

  W^obs(a, b) = (1/√a) ∫ a^obs(t) ψ*((t − b)/a) dt    (5)

Since the wavelet transform is linear, applying it to (1) with the constant rotation matrix R gives

  W^obs(a, b) = R W^abs(a, b) + W^n(a, b)    (6)

As a preliminary step, we can obtain specific signal patterns under the situation of known acceleration-sensor positions by using wavelet analysis. This is because the specific signal patterns that we want to detect are common user activities, such as running and walking, and are easily collected from the acceleration sensors. Thus, we assume that K kinds of specific signal patterns have already been detected and stored, and we define W^{p_k}(a, b) as the k-th specific signal pattern. Assuming that the noise n(t) is white noise, we use the normalized cross-correlation ρ_k(R) below as a criterion to extract the appropriate specific signal pattern from the observed acceleration data:

  ρ_k(R) = Σ_{l=1}^{L} ⟨W^obs(a_l, b_l), R W^{p_k}(a_l, b_l)⟩ / (‖W^obs‖ ‖W^{p_k}‖)    (7)

where L represents the number of combinations of the parameters a and b.
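The following numpy-only sketch shows one way the wavelet feature vectors of (5) and the normalized cross-correlation of (7) could be computed for a 3-axis acceleration segment. The Ricker (Mexican-hat) mother wavelet, the choice of scales and shifts, and the function names are assumptions made for illustration; the paper does not state which mother wavelets are used for each activity.

```python
# Sketch of (5) and (7); mother wavelet, scales, and shifts are assumptions.
import numpy as np

def ricker(t: np.ndarray, a: float) -> np.ndarray:
    """Ricker (Mexican-hat) mother wavelet with scale a."""
    x = t / a
    return (1.0 - x**2) * np.exp(-0.5 * x**2) / np.sqrt(a)

def wavelet_features(signal_3axis: np.ndarray, scales, shifts, dt: float) -> np.ndarray:
    """Wavelet coefficients W(a_l, b_l) for each axis of a 3 x T segment,
    returned as a 3 x L matrix (L = number of (a, b) combinations)."""
    n = signal_3axis.shape[1]
    t = np.arange(n) * dt
    feats = []
    for a in scales:
        for b in shifts:
            psi = ricker(t - b, a)
            feats.append(signal_3axis @ psi * dt)   # inner product per axis
    return np.array(feats).T                        # shape (3, L)

def normalized_cross_correlation(W_obs: np.ndarray, W_pat: np.ndarray,
                                 R: np.ndarray) -> float:
    """Criterion of (7): correlation between observed features and the
    stored pattern features rotated by candidate matrix R."""
    num = float(np.sum(W_obs * (R @ W_pat)))
    den = np.linalg.norm(W_obs) * np.linalg.norm(W_pat)
    return num / den
```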
To optimize the extraction, we use Lagrange's method with the condition (4) on the elements of the matrix R. Next, we define

  J_k(R, Λ_k) = ρ_k(R) − tr[Λ_k (R R^T − I)]    (8)

where Λ_k is a symmetric matrix of Lagrange multipliers. The conditions for the optimal estimates of R are

  ∂J_k/∂R = 0    (9)

  ∂J_k/∂Λ_k = 0    (10)

This yields simultaneous equations as follows:

  M^(k) = Λ_k R    (11)

where

  M^(k)_mn = Σ_{l=1}^{L} W^obs_m(a_l, b_l) W^{p_k}_n(a_l, b_l) / (‖W^obs‖ ‖W^{p_k}‖)    (12)

Together with (4), equation (11) reduces to an eigenvalue problem for M^(k) M^(k)T, whose solution gives R directly. Therefore, by solving (11) we do not need to search for the optimal solutions over all conditions; we can obtain the optimal R_k as the candidate representing the best condition for the k-th specific signal pattern as below:

  R_k = argmax_R ρ_k(R)    (13)

where R_k must satisfy (4). The same approach is applied to all the other specific signal patterns, and we then select the pattern whose criterion value is the best:

  k̂ = argmax_k ρ_k(R_k)    (14)

To summarize this section, we only have to solve K (the number of specific signal patterns) eigenvalue problems and find the optimal patterns that satisfy the conditions of (11), (7) and (8). This algorithm enables PerContEx to extract the user's context independent of cell phone position.
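A minimal sketch of the pattern-matching loop is shown below. Because the exact eigenvalue formulation of (11)-(13) cannot be recovered from the text, the sketch uses the standard SVD (orthogonal Procrustes) closed form for the rotation that maximizes the correlation of (7) under the orthogonality constraint (4); this is a common way to solve that constrained maximization, but it should be read as an assumed implementation, not as the authors' exact procedure.

```python
# Sketch only: SVD-based solution of the constrained maximization;
# assumed equivalent to the paper's eigenvalue formulation.
import numpy as np

def best_rotation(W_obs: np.ndarray, W_pat: np.ndarray) -> np.ndarray:
    """Rotation R (with R R^T = I, det R = +1) maximizing the correlation
    between W_obs and R @ W_pat, via SVD of the 3 x 3 cross-correlation
    matrix corresponding to (12)."""
    M = W_obs @ W_pat.T                      # 3 x 3 cross-correlation matrix
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

def detect_pattern(W_obs: np.ndarray, patterns: dict) -> tuple:
    """Align every stored pattern k with its optimal rotation and return
    the pattern with the best normalized correlation, as in (13)-(14)."""
    best = (None, -np.inf, None)
    for k, W_pat in patterns.items():
        R = best_rotation(W_obs, W_pat)
        rho = float(np.sum(W_obs * (R @ W_pat))) / (
            np.linalg.norm(W_obs) * np.linalg.norm(W_pat))
        if rho > best[1]:
            best = (k, rho, R)
    return best   # (pattern id, correlation value, rotation matrix)
```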
IV. BASIC EXPERIMENTS

A. Experiment description

We performed a basic validation test of PerContEx under the two scenarios described below (the data transmitter was not implemented for these trials).

- Action scenario for physical and environmental context. In this scenario, a user moves around and stops to do mobile communication.
  1) Walking around the office room.
  2) Stopping and putting the cell phone in the user's pocket.
  3) Moving from the office room to a quiet area with the cell phone in his pocket.
  4) Stopping and typing on the cell phone's keypad.
  5) Moving from the quiet area to the office room with the cell phone in his pocket.
- Action scenario for emotional context. In this scenario, a user sits on a chair and accesses Internet sites.
  1) Relaxing with closed eyes for a few minutes (the cell phone is held in the user's hand).
  2) Using the Internet for a short time to identify the optimal route for a business trip.
  3) Relaxing with closed eyes for a few minutes (the cell phone is held in the user's hand).
  4) Browsing the user's favorite sites.

B. Results

1) Extracting physical and environmental context: Fig. 5 shows the time-series data output by the sensors. Walking and putting the phone in the pocket could be detected as physical context from the acceleration data. The pressure sensors and the skin resistance sensor could also detect key-typing. The illumination and infrared data could detect environmental context such as the office room, the access aisle, and the quiet area.

Fig. 5. Time-series data of the multiple sensors (illumination, skin resistance, and infrared), annotated with the action (walking, putting in pocket, key-typing), holding state (in hand, pocket), and place (office room, access aisle, quiet area) for scenario steps 1)-5).

2) Extracting biological user context: Figs. 6 and 7 show sphygmograms and chaos attractors, respectively, for the user states of calmness, stress, and excitement. In both figures these states can be easily distinguished. In particular, the differences between the chaos attractor patterns are quite clear, so these patterns suggest the possibility of extracting the user's low-level emotion such as concentration or excitement.

Fig. 6. Sphygmograms under several situations: (a) calmness, (b) stress, (c) excitement.

Fig. 7. Chaos analysis of the sphygmograms: (a) calmness, (b) stress, (c) excitement.
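The paper refers to chaos analysis of the sphygmogram but does not give the procedure. The sketch below shows one conventional way to obtain an attractor from a pulse-wave time series, namely time-delay embedding; the delay, embedding dimension, the correlation sum as a complexity measure, and the synthetic test signal are all illustrative assumptions rather than the method of [9].

```python
# Illustrative delay-embedding of a sphygmogram; parameters are assumptions.
import numpy as np

def delay_embed(x: np.ndarray, dim: int = 3, tau: int = 10) -> np.ndarray:
    """Reconstruct an attractor from a scalar time series x by time-delay
    embedding: points (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_sum(points: np.ndarray, r: float) -> float:
    """Fraction of point pairs closer than r; a simple complexity measure
    that can differ between calm, stressed, and excited states."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return float(np.mean(d[iu] < r))

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 2000)
    pulse = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))  # synthetic pulse wave
    attractor = delay_embed(pulse)
    print(attractor.shape, correlation_sum(attractor[:300], 0.5))
```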
V. CONCLUSION
We built a cell phone with multiple sensors that liberates the user from having to wear sensors to access context-based services. We also proposed an algorithm for extracting user context that is robust to sensor position. Initial tests showed that PerContEx could extract primitive user context. We are currently conducting more extensive experiments and deeper analysis. At the conference, we will present the experimental results and provide an evaluation of the domain of applicability.

ACKNOWLEDGMENT

The authors would like to thank Mr. Kazuo Imai, executive manager of Network Laboratories, NTT DoCoMo, Inc., for his continued support and encouragement.
REFERENCES

[1] http://www.nttdocomo.com/home.html
[2] http://www.mediateam.oulu.fi/projects/capnet/
[3] T. Naganuma and S. Kurakake, "A task oriented approach to service retrieval in mobile computing environment," in Proceedings of the IASTED International Conference on Artificial Intelligence and Applications, 2005.
[4] D. Siewiorek, A. Smailagic, J. Furukawa, A. Krause, N. Moraveji, K. Reiger, J. Shaffer, and F. L. Wong, "SenSay: A Context-Aware Mobile Phone," in Proc. 7th IEEE ISWC, pp. 248-257, 2003.
[5] A. Krause, D. P. Siewiorek, A. Smailagic, and J. Farringdon, "Unsupervised, Dynamic Identification of Physiological and Activity Context in Wearable Computing," in Proc. 7th IEEE ISWC, pp. 88-17, 2003.
[6] K. V. Laerhoven, A. Schmidt, and H. W. Gellersen, "Limitations of Multi-Sensor Context-Aware Clothing," Special Issue on Wearable Computers, Journal on Personal and Ubiquitous Computing, 7(3), 2003.
[7] N. Kern, S. Antifakos, B. Schiele, and A. Schwaninger, "A model for human interruptability: experimental evaluation and automatic estimation from wearable sensors," in Proc. 8th IEEE ISWC, 2004.
[8] B. Clarkson, K. Mase, and A. Pentland, "Recognizing User's Context from Wearable Sensors: Baseline System," Vismod TR Vol. 519, March 4, 2000.
[9] T. Miao, T. Shimizu, S. Miyake, M. Hashimoto, and O. Shimoyama, "Alterations of Complexity Measures during Subsidiary Task in Multi-attribute Operations," Human Interface Symposium, pp. 791-792, 2003.