International Global Navigation Satellite Systems Society IGNSS Symposium 2009 Holiday Inn, Surfers Paradise, Qld, Australia 1 - 3 December, 2009
Experiments Utilizing Data Glove and High-Performance INS Devices in an Immersive Virtual Mining Environment
Tomasz P Bednarz 3D Visualisation Software Engineer / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4706, Fax: +61 7 3327 4566, Email:
[email protected]
Con Caris 3D Visualisation Coordinator / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4568, Fax: +61 7 3327 4566, Email:
[email protected]
Chris Wesner 3D Application Programmer / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4109, Fax: +61 7 3327 4566, Email:
[email protected]
Peter B Reid Research Engineer / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4704, Fax: +61 7 3327 4566, Email:
[email protected]
Gianluca Falco PhD Candidate / Politecnico di Torino / Italy Phone: +11 5646666, Fax: +11 5646329, Email:
[email protected]
Garry Einicke Senior Research Engineer / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4615, Fax: +61 7 3327 4566, Email:
[email protected]
John Malos Research Physicist / CSIRO Exploration and Mining / Australia Phone: +61 7 3327 4114, Fax: +61 7 3327 4566, Email:
[email protected]
ABSTRACT The present work concerns immersive virtual reality experiments for the mining industry. The experiments demonstrate the efficacy of devices used for teleoperation of mining equipment. The visualisation is carried out using the Unity3D 2.5 multiplatform game development tool, which communicates with a .Net socket server allowing continuous data feeding from: (a) a 5DT Data Glove Ultra that measures finger flexures; (b) a MicroStrain 3DM-GX2 high-performance orientation sensor that provides information about acceleration and rotational attributes such as pitch, roll and yaw. The user is placed at the focal point of a 4-m dome and experiences being part of the mining environment. Users can also navigate and manipulate the fields of view using orientation data, and control the behaviour of the underground mining equipment using six distinctive hand gestures. In the future, the same techniques can be applied to monitor real mining environments, in which the mining equipment is geo-referenced using the same IMU devices and surrounded by other sensor networks. KEYWORDS: IMU, Data Glove, Immersive Virtual Reality, Dome.
1. INTRODUCTION The potential of immersive virtual reality (IVR) technology as a major technical advance for supporting teleoperation scenarios has already been well recognised by multiple industries [1]. One of its unique advantages is the capability to allow operators to visualise abstract concepts, and to visit environments and interact with objects or events that distance, time, scale or safety factors make difficult or impossible to access. In addition, teleoperation scenarios offer safety advantages compared to the physical presence of people in harmful environments. Therefore, early investigations of possible technologies that can be used in such scenarios are very beneficial. The present paper focuses mainly on the application of IMU/GPS devices for tracking mine surface equipment using an improved loosely coupled process. The real-world coordinates are then passed to a virtual reality data server to construct a calculated virtual mine representation. Input devices such as a data glove and an iPhone are used to interact with the camera viewports and objects, and ultimately to send specific safety commands to running equipment where required. It is also important to realise that the same IVR can be used for training applications. Regian, Shebilske, and Monk [2] reported on empirical research that explored the instructional potential of immersive VR as an interface for simulation-based training. According to these researchers, VR may hold promise for simulation-based training, as the interface preserves both the visual-spatial characteristics of the simulated world and the linkage between the motor actions of the student and the resulting simulated effects. VR offers the possibility of presenting both small-scale (viewable from a single vantage point at a single instant) and large-scale (experienced across time) [3] spatial information in a three-dimensional format, eliminating the need for users to translate between different visual modalities.
The Cell Biology Project [4] serves as an example of a study focusing on the impact of VR, defined in terms of immersion, natural interaction (via hand trackers) and interactivity, on the effectiveness of informal education (i.e., self-directed, unstructured learning experiences). The application was developed in three formats (immersive VR, desktop VR and videotape viewing) to compare the various visual modalities. Comparison of the modalities highlighted that immersive users scored higher on post-testing of symbolic and graphic retention and were much stronger in terms of level of engagement. However, the non-immersive groups were able to retain more cognitive information (at least in the short term). Adamo-Villani et al. [5] presented their development of immersive 3D learning environments designed to increase core mathematical skills in deaf children. Their method taught mathematical concepts and sign language through the user's interaction with fantasy 3D signers and with the virtual environments. Bell and Fogler [6] suggest that VR is a good medium for presenting 3D objects and relationships and for illustrating concepts that have been covered elsewhere. However, they did not find it an appropriate medium for conveying factual information, nor a substitute for traditional educational methods. Yet, when used properly, it can augment traditional methods and lead to improved long-term retention of material. These benefits are also directly applicable to visually oriented students, helping them to engage more fully with the learning process [7]. Different technologies exist to determine the position of a vehicle, of which two are most commonly used [8, 9]. The first is the Global Positioning System (GPS), which relies on radio-frequency (RF) signals from satellites in space with known locations.
The second technology is an Inertial Navigation System (INS), which, together with dead-reckoning calculations, provides dynamic position information. The accuracy required for a land vehicle generally varies with the application. For instance, the acceptable accuracy level for an autonomous car is 5-10 metres, but tracking machines on mining sites requires greater (sub-metre) precision [10]. The advantage of using GPS is its ability to provide absolute navigation information. Although the solution provided by GPS is sufficiently accurate (especially when used in differential mode, i.e. DGPS), it is unable to fulfil the requirements of continuity and reliability in many situations. Being a
satellite-based navigation system, GPS requires line of sight (LOS) between the receiver antenna and the satellites and thus is unsuitable for underground or covered use. Signal interruption is a primary factor affecting the continuity and reliability of the navigation solution from GPS [10]. The issue of continuity is addressed to some extent by High Sensitivity GPS (HSGPS) receivers, which are able to track low-strength signals by using longer detection integration times and data wipe-off methods [11, 12]. However, the use of such measurements in degraded signal environments can be detrimental to the navigation solution if measurement faults are not identified and understood. In general, GPS cannot be used on its own for navigation, and risk factors escalate in harsh or highly variable operating environments. Unlike GPS, an INS is a self-contained navigation system which provides independent position and velocity information. The advantage of INS over GPS is its independence from external electromagnetic signals and its ability to operate in all environments. This allows an INS to provide a continuous navigation solution with excellent short-term accuracy [13]. However, INS technology suffers from time-dependent error growth causing a drift in the solution and ultimately compromising the long-term accuracy of the system. The drift in a high-quality inertial device is small and can fulfil the accuracy requirements of land applications for longer periods. Unfortunately, there are two main limitations on their use in general applications: prohibitive cost as precision increases, and governmental regulations restricting acceptable uses. Recently, with advances in Micro Electro-Mechanical System (MEMS) technology, low-cost MEMS-based inertial sensors have become available. For example, in this work we have used a low-cost MicroStrain 3DM-GX2 IMU.
The main drawback of this device, due to the relative immaturity of the technology, is that sensor performance is limited [14], resulting in the navigation solution degrading rapidly in the absence of an aiding source. The error of even today's most accurate MEMS IMU would become unacceptable after a few minutes (kilometre-scale error after 10 minutes). It therefore becomes necessary to provide an INS with regular updates in order to bound the errors to an acceptable level. The powerful synergy between GPS and INS makes the combination of these two navigation technologies a viable positioning option. GPS, when combined with MEMS inertial devices, can restrict their error growth over time and allows for online estimation of the sensor errors, while the inertial devices can bridge the position estimates when there is no GPS signal reception. Ultimately, a navigation solution derived from a GPS/INS system confers more benefits than either standalone solution, and this avenue has been well researched [9, 20]. Different integration strategies have been developed and tested across different grades of IMUs. Typically three main strategies are used: ultra-tight (or deep) integration, loose integration and tight integration. Deep integration is performed at the hardware level and is therefore practical for implementation by OEMs or through software receivers [15]. Exposing the GPS's software interface enables access and control to core parts of a GPS receiver, such as the acquisition and tracking stages commonly inaccessible in hardware-only GPS modules. As a consequence, GPS/INS integration at this level improves the robustness of the GPS receiver tracking loop under normal vehicle dynamics, improving the accuracy of positioning performance compared to other integration implementations [9, 16]. The other two integration strategies, loosely and tightly coupled, are used more commonly [9, 17, 19].
The first method uses GPS position and velocity measurements in a Kalman filter that models the INS error dynamics. The second approach combines GPS estimates of pseudo-ranges and Doppler data (determined using satellite ephemeris data) with calculated inertial estimates within a Kalman filter. In this paper we have adopted the loosely coupled approach, utilising a SiRF-JP13 GPS receiver and a 3DM-GX2 INS.
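To illustrate why an unaided MEMS solution degrades so quickly, the sketch below double-integrates a small constant accelerometer bias, a minimal model of INS position error growth. The bias value is an assumption chosen for illustration, not a measured 3DM-GX2 figure:

```python
# Double integration of a constant accelerometer bias: a minimal model of
# unaided INS position error growth. The 5 mm/s^2 bias is an assumed,
# illustrative value for a low-cost MEMS sensor.
bias = 0.005              # m/s^2 (assumed)
dt = 0.01                 # 100 Hz IMU sample rate
steps = int(600 / dt)     # 10 minutes of dead reckoning

vel_err = pos_err = 0.0
for _ in range(steps):
    vel_err += bias * dt      # first integration: velocity error
    pos_err += vel_err * dt   # second integration: position error

print(f"position error after 10 min: {pos_err:.0f} m")
```

The quadratic growth (approximately 0.5·b·t², here roughly 0.9 km after ten minutes for a bias of only 5 mm/s²) is consistent with the kilometre-scale drift noted above, and motivates bounding the errors with GPS updates.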
2. IMMERSIVE VIRTUAL REALITY SYSTEM Figure 1 shows the logical flow chart of the existing system for our human-computer interaction experiments in a teleoperated surface mine environment. The user (on the left) is the first element in the diagram, interacting with the system via two input devices: a 5DT Data Glove 5 Ultra (Fifth Dimension Technologies) and an iPhone 3G (Apple Inc).
Figure 1: Logical flow chart of the system.
The data glove is connected wirelessly to a PC. The finger flexures are reported individually as integer values reflecting the bending of the intermediate and proximal phalanges of each digit. Prior to running the experiments, the glove is calibrated using software that estimates the minimum and maximum flexion and extension for each sensor, and then calculates normalised floating point numbers from 0 to 1.0, i.e., from maximum extension to maximum flexion. Therefore, utilising simple logic statements across all digit inputs, hand gestures can easily be implemented. The iPhone is utilised primarily as a programmable interactive visual touch screen, and is used to select different menu items, for instance to change the user's view. For convenience, the user can attach the two sensing devices (data glove and touch pad) to the same hand/wrist for interaction inputs (gestures and orientation). A small hand-gesture icon, displayed next to the text description on the iPhone, is used to highlight a specific menu item so the user always knows which gesture to make in order to select that item. In the same way, the user can select a specific object with the data glove to send commands to the VR engine and/or downstream components. The coupled IMU/GPS system detailed in section 3 is used to geo-reference the location of specific mine equipment (for instance, loading trucks, trains, etc.). The estimated locations are sent to the data server, which also collates data from the data glove and the iPhone. Processed data is then passed on to the VR sub-system via low-level socket connections. The VR system utilised was Unity3D 2.5, a multiplatform game development tool, which displayed the user interface, the environment and equipment representations, and processed the interactions transmitted from the input devices.
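The normalisation and simple-logic gesture scheme described above can be sketched as follows. The calibration limits, gesture names and flexion threshold are illustrative assumptions, not the 5DT SDK's actual interface:

```python
def normalise(raw, lo, hi):
    """Map a raw sensor reading onto [0, 1]: 0 = full extension, 1 = full flexion."""
    return max(0.0, min(1.0, (raw - lo) / float(hi - lo)))

# Hypothetical per-sensor calibration limits (thumb .. little finger),
# as would be estimated by the calibration software.
CAL = [(120, 900), (100, 880), (110, 870), (130, 910), (90, 860)]

def detect_gesture(raw_values, threshold=0.5):
    """Classify a hand pose from five normalised flexures using simple logic
    statements across all digit inputs. Gesture names are illustrative."""
    flex = [normalise(r, lo, hi) for r, (lo, hi) in zip(raw_values, CAL)]
    bent = [f > threshold for f in flex]
    if all(bent):
        return "fist"
    if not any(bent):
        return "open_hand"
    if not bent[0] and all(bent[1:]):
        return "thumbs_up"
    return "unknown"

print(detect_gesture([850, 860, 850, 880, 840]))  # all digits flexed -> fist
```

Each of the six gestures used in the experiments would correspond to one such boolean pattern over the five digits.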
The final output is displayed in an immersive 3D environment on a four metre hemispherical dome screen using two Christie HD3 professional projectors. The projectors are mounted on the ceiling 1.5 metres apart and are located three metres from the focal point of the screen. Data and user interface calibration/interaction examples have been previously described in our paper [1].
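The data-server pattern described above, where sensor feeds are collated into a shared state that the VR engine polls over low-level sockets, can be sketched as below. The actual system uses a .Net server feeding Unity3D; this Python sketch only illustrates the pattern, and the JSON field names and wire format are assumptions:

```python
import json
import socket
import threading

# Shared state written by sensor threads (glove, IMU, iPhone) and read by VR
# clients. Field names and the JSON-lines wire format are illustrative.
state = {"glove": [0.0] * 5, "orientation": {"pitch": 0.0, "roll": 0.0, "yaw": 0.0}}
state_lock = threading.Lock()

def handle_client(conn):
    """Send a snapshot of the collated sensor state for every request byte."""
    with conn:
        while conn.recv(16):
            with state_lock:
                payload = (json.dumps(state) + "\n").encode()
            conn.sendall(payload)

def serve(host="127.0.0.1", port=5005):
    """Accept VR-engine connections and serve each on its own thread."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()
```

A thread-per-client design with a single locked state dictionary keeps the server simple; the real system would instead push glove, IMU/GPS and iPhone updates into this state as they arrive.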
3. LOOSELY COUPLED – IMU/GPS INTEGRATION Whilst GPS and INS integration can occur via various approaches as highlighted earlier, for the remainder of this paper we shall limit our attention to the performance of the loosely-coupled (LC) algorithm only. Integration typically occurs via one of two strategies: either an open-loop (feed-forward) or a closed-loop (feedback) approach. In the open-loop implementation, the INS mechanisation operates independently, unaware of the existence of an estimator such as a Kalman filter. The Kalman filter estimates the errors in the mechanisation-derived navigation information, calculates corrections, and outputs updated values. The corrected parameters and estimated sensor biases and drifts are not sent back to the navigation processor. Without feedback, the mechanisation error grows rapidly, and therefore this kind of approach is valid only for high-quality INS sensors that propagate relatively small errors. However, in our case we employed a low-cost MEMS IMU which generates large errors in a short time, and thus a different approach was required. An error compensation scheme was selected, and a closed-loop integration scheme adopted. This feedback approach (as depicted in Figure 2) corrects the raw sensor output and other mechanisation parameters using the error estimates obtained from the Kalman filter. In this way, the mechanisation propagates small errors in order to maintain good accuracy in the final user position and velocity estimates.
Figure 2: A closed-loop Loosely Coupled GPS/INS integration scheme.
For the mathematical equations that describe the loosely coupled integration of GPS/INS in a Kalman filter, we point the reader to papers that provide detailed analysis of the topic [17, 18]. Briefly, independent position estimates are calculated within a GPS receiver and are optionally filtered. This output is then used periodically as input to an INS filter. A second Kalman filter uses the differences between the GPS-derived vector estimates and the INS-derived vectors as a means to obtain error estimates. A typical INS Kalman filter generally consists of nine navigation error states: three position error states (δr^n), three velocity error states (δv^n) and three attitude error states (see [17, 18]). In this work we are interested in showing a possible application of a GPS/INS system for controlling objects in a virtual reality scenario, and thus we tested the Loosely-Coupled algorithm along a surveyed path as shown in Figure 3.
Figure 3: Test track view and 3D plot.
The performance of the LC algorithm is shown in Figure 4.
(a)
(b)
Figure 4: Loosely-Coupled algorithm: position compared with the INS-only solution (a) and with the GPS-only solution (b)
As evident in Figure 4, the INS-only solution produced unacceptable errors in position estimation, exceeding 3 km after 6 minutes. By exploiting the GPS information in the LC algorithm, the error is reduced significantly and follows the GPS solution; this is also true for velocity, as shown in Figure 5. Euler angles are presented in Figure 6.
(a)
(b)
Figure 5: Loosely-Coupled algorithm: velocity compared with the INS-only solution (a) and with the GPS-only solution (b)
Figure 6: Loosely-Coupled algorithm: Euler angles (Yaw, Pitch, Roll) estimation.
4. TELEOPERATED SCENARIO The main goal of the IVR experiment is to employ the apparatus and methods described previously to create a realistic and intuitive teleoperation experience. The IVR arrangement presents a 3D mine engineering environment and responds to inputs and gestures provided by the user. The system is immersive in that the 3D scene fills the user's horizontal and vertical peripheral vision whilst enabling the user to fully interact with the viewed content. The user can change the camera view, including pan, tilt and zoom, as well as explore object properties such as velocity, orientation, location and tonnage. In the present experiment, the user stands in front of a four metre hemispherical dome, as seen in Figure 7(a). The user's field of view is restricted to the pictures and animations available on the screen. This enhances virtual situational awareness and reduces possible distractions compared to viewing the same virtual world on a monitor screen; when the field of view is only partially covered by the content, the user can easily be distracted by motion in the background that does not belong to the visual system. In our system, the user can be presented with two modes: view mode and interactive mode. The view mode is a safe/lock mode, i.e., the user can only view the content of the virtual world and cannot interact with the system. This is essential, for instance, when changing poses or talking animatedly using the gloved hand. In the interactive mode, the user interacts directly with the objects and, depending upon the underlying code and gesture recognition, can send messages to downstream components. Within the framework there is the ability to record and play back experiences. Figure 7(b) shows a snapshot from a 3D world.
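The two-mode scheme above can be sketched as a small state machine: "view" is the safe/lock mode in which gestures cannot affect equipment, while "interactive" forwards gesture commands downstream. The gesture and command names here are illustrative assumptions, not the system's actual vocabulary:

```python
class InteractionController:
    """Minimal sketch of the view/interactive mode switching described in the text."""

    def __init__(self):
        self.mode = "view"  # start in the safe/lock mode

    def handle_gesture(self, gesture):
        if gesture == "mode_toggle":
            # A dedicated gesture flips between the two modes.
            self.mode = "interactive" if self.mode == "view" else "view"
            return ("mode_changed", self.mode)
        if self.mode == "view":
            return ("ignored", gesture)   # locked: no object interaction
        return ("command", gesture)       # forwarded to downstream components

ctrl = InteractionController()
print(ctrl.handle_gesture("fist"))        # ignored while in view mode
ctrl.handle_gesture("mode_toggle")
print(ctrl.handle_gesture("fist"))        # now forwarded as a command
```

Keeping the safe mode as the default ensures that incidental hand movement (changing poses, talking animatedly) cannot reach running equipment.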
The camera paths and views are usually calculated from the locations and orientations returned by the coupled IMU/GPS system, such that the selected objects are always viewed correctly according to pre-chosen viewing orientations. The geo-referenced locations of the objects are stored in a database for replaying scenarios or for referencing purposes.
(a)
(b)
Figure 7: Inside the virtual world, (a) physical system, (b) immersive virtual reality scenario.
5. SUMMARY The advantage of the loosely coupled integration strategy is that the state vectors (in the GPS-only and INS-only filters) are generally of smaller dimension than in the tight integration case. This translates into faster processing times, which is required by a system that also integrates information from a data glove. That is, the choice of a Loosely-Coupled integration lowers the required computational effort, making it more suitable for real-time implementation.
Employing these systems within an immersive virtual reality allowed us to build prototypes for accurate real-time teleoperation scenarios within industrial applications. Importantly, this IVR system can also provide an "enhanced" teaching medium. The IVR environment enables a user to physically interact with various 3D objects and behaviours, which in turn reinforces the idea that learning by doing is efficient. Therefore, the same technology can be used both for training people to operate mining equipment using various interactive devices, and as part of real teleoperated mine sites integrated into industrial control rooms. CSIRO has already developed VR systems to remotely monitor underground mining equipment (LASC Longwall Automation, http://www.longwallautomation.com/). Collaborative industry projects are currently underway that will see CSIRO leverage IVR technology to demonstrate that teleoperation using VR can be a reality.
REFERENCES
[1] Bednarz, T.P., Caris, C., Dranga, O. (2009) "Human-Computer Interaction experiments in an immersive virtual reality environment for e-learning applications", AaeE Conference, Adelaide, December 2009.
[2] Regian, J.W., Shebilske, W.L., & Monk, J.M. (1992) "Virtual reality: An instructional medium for visual-spatial tasks", Journal of Communication, 42(4), pp. 136-149.
[3] Siegel, A.W. (1981) "The externalization of cognitive maps by children and adults: In search of ways to ask better questions". In L. Liben, A. Patterson, & N. Newcombe (Eds.), Spatial representation and behavior across the life span, New York: Academic Press, pp. 163-189.
[4] Gay, E. (1994) "Is Virtual Reality a Good Teaching Tool?", Virtual Reality Special Report, Winter, pp. 51-59.
[5] Adamo-Villani, N., Carpenter, E., & Arns, L. (2006) "3D sign language mathematics in immersive environment", Proc. of the 15th IASTED International Conference, June, Rhodes, Greece.
[6] Bell, J.T., & Fogler, H.S. (1996) "Preliminary Testing of a Virtual Reality Based Educational Module for Safety and Hazard Evaluation Training", Proc. American Society for Engineering Education Annual Conference, Indiana Sectional Meeting, Peoria, IL.
[7] Hainich, R.R. (2009) "The End of Hardware, 3rd Edition: Augmented Reality and Beyond", BookSurge Publishing.
[8] El-Rabbany, A. (2002) "Introduction to GPS: The Global Positioning System", Artech House, Inc., Boston, MA.
[9] Titterton, D. and Weston, J. (2002) "Strapdown Inertial Navigation Technology", Institution of Electrical Engineers, Michael Faraday House, UK.
[10] Kaplan, E.D., and Hegarty, C.J. (2006) "Understanding GPS: Principles and Applications", 2nd ed., Artech House, Inc., Boston, MA.
[11] Chansarkar, M.M., Garin, L.J. (2000) "Acquisition of GPS signals at very low signal to noise ratio", Proceedings of the ION National Technical Meeting 2000, Anaheim, CA, USA.
[12] Lachapelle, G., Kuusniemi, H., Dao, D.T.H., MacGougan, G., and Cannon, M.E. (2004) "HSGPS Signal Analysis and Performance under Various Indoor Conditions", Navigation, 51(1), pp. 29-43.
[13] Chiang, K., Hou, H., Niu, X. and El-Sheimy, N. (2004) "Improving the positioning accuracy of GPS/MEMS IMU integrated systems utilizing cascade de-noising algorithm", ION GNSS 17th International Technical Meeting of the Satellite Division, Long Beach, CA, pp. 809-818.
[14] Shin, E-H. (2005) "Estimation Techniques of Low-Cost INS-GPS for Land Application", PhD thesis, Dept of Geomatics Eng., University of Calgary, Calgary, Canada.
[15] Kondo, S-I., Kubo, N. and Yasuda, A. (2005) "Evaluation of the Pseudorange Performance by Using Software GPS Receiver", Journal of Global Positioning Systems, No. 1-2.
[16] Li, Y., Wang, J., Rizos, C., Mumford, P., Ding, W. (2006) "Low-cost tightly coupled GPS/INS integration based on a nonlinear Kalman filtering design", Proceedings of the ION National Technical Meeting.
[17] Solimeno, A. (2007) "Low-Cost INS/GPS Data Fusion with Extended Kalman Filter for Airborne Applications", Master's thesis, Universidade Técnica de Lisboa.
[18] Falco, G., Einicke, G., Malos, J.T., Dovis, F. (2009) "Performance Analysis of Constrained Loosely Coupled GPS/INS Integration Solutions", IGNSS Symposium, Surfers Paradise, Qld, Australia, 1-3 December.
[19] Wolf, R., Eissfeller, B. and Hein, G. (1997) "A Kalman Filter for the Integration of a Low Cost INS and an Attitude GPS".
[20] Savage, P.G. (1996) "Introduction to Strapdown Inertial Navigation System", Vols 1 and 2, Strapdown Associates, Maple Plain, MN.