Integration of Haptics with Web3D using the SAI

Liam Kurmos∗, Nigel W. John†, Jonathan C. Roberts‡
Bangor University, UK
Abstract
Haptics force-feedback technology is fast becoming a consumer product and no longer only found in research laboratories. The emergence of the budget Falcon device (Novint Technologies, Inc., USA) represents a key step in the dissemination of haptics technology as it offers this functionality to home users, in particular to games players. Haptics has the potential to revolutionise the Human Computer Interface if novel and creative software solutions can be found to utilise it. Currently developing for haptics requires low level programming knowledge, which is often a barrier to uptake. This paper looks at how haptics support can be integrated into an X3D authored virtual world using an open source haptics library via the Scene Authoring Interface (SAI). We supply a partial implementation of a Java wrapping to the HAPI open-source haptics library and provide a demonstration of its use within the Xj3D browser through SAI. This work is intended to contribute to a possible future haptics extension of the ISO X3D standard.
CR Categories: H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces—Graphical User Interfaces, Haptic I/O;
1 Introduction
The sense of touch can be an important cue within a virtual world. Studies have shown that learning is enhanced when students are able to interact with their subject matter haptically, and that force patterns are better memorised when presented both visually and haptically rather than with either modality alone [Morris et al. 2007]. In fact, one of the primary application areas that has been driving research in Computer Haptics is the development of medical simulators for training, planning and rehearsal of procedures. For example, Vidal et al. [2008] demonstrate how ultrasound guided needle puncture can be simulated with 3D textures and volume haptics. Telemedicine also has great potential to benefit from haptics [Güler and Übeyli 2002], [Menachemi et al. 2004]. For instance, a surgeon might operate robotic surgical implements to carry out a procedure remotely, relying on long distance communication (the Internet) to carry data to the patient's location and provide haptic feedback to the surgeon. However, many of these haptic devices are specialized pieces of equipment.

Commercial devices, especially medical devices, have become readily available in recent years but have been relatively expensive to purchase and have required low-level programming skills to use effectively. However, affordable products are now starting to appear. In particular, the emergence of the budget Falcon device (Novint Technologies, Inc., USA) represents a key step in the dissemination of haptics technology, as it offers this functionality to home users, in particular to games players. We believe that the widespread use of haptics will be driven by the computer games industry. Games designers are released from the constraints imposed on medical devices and consequently have greater freedom to develop engaging entertainment. Computer haptics is now at a stage where the basic framework of the technology has been established; users can move in 3D and feel various forces, and thus developers can begin to fully explore what can be done with this technology.

However, haptic devices are generally difficult to program for. In particular, to create a haptic application the developer generally needs to understand low-level programming concepts, a requirement that will inhibit the widespread uptake and utilization of haptic devices. The ubiquity and extensibility of X3D provide an ideal platform: by integrating haptics into X3D and providing an X3D authoring tool, developers will be able to author three-dimensional haptic worlds easily and quickly.

This paper presents how haptic components can be integrated into an X3D authored virtual world using an open source haptics library via the Scene Authoring Interface (SAI). We supply a partial implementation of a Java wrapping of the HAPI open-source haptics library and demonstrate its use within the Xj3D browser through the SAI.

To generate a haptic display the developer integrates three components: the haptic device, the rendering of the haptic scene, and the model. Section 2 provides background information on haptics and describes various haptic devices, and Section 2.3 describes some of the challenges of rendering the haptic scene. Section 3 presents the methods and tools used to create the virtual model. The tools are generally available as C++ libraries; the two most popular open source solutions are SenseGraphics' H3D/HAPI [Sensegraphics] and Chai3D [Barbagli 2005] from The Open Source Haptics Project.

To integrate with Web3D technologies, a Java API would be desirable. With this in mind we have implemented a Java Native Interface (JNI) wrapping of the HAPI haptics library that allows a Java programmer to access the C++ HAPI library. We discuss JNI and our JHAPI implementation in Section 3.3. Beyond haptic support for Java, another way to facilitate haptic uptake would be to add native haptic support to rapid application development frameworks such as the X3D ISO standard [Web3D a]; in Section 3.1 we look at how support for haptic nodes could be incorporated. If this can be achieved, the high-level nature of the scene graph will allow the programmer to simply specify haptic properties for virtual objects and leave the technical details of haptic rendering to the X3D browser and its supporting libraries. As an intermediate step we demonstrate the use of our JHAPI library within the Xj3D browser, described in Section 5, where the library is accessed through Xj3D's Java-based Scene Authoring Interface (SAI).
∗e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]
2 Background
In English the word 'haptic' is an adjective meaning 'of the sense of touch'. The precise Greek origin of the word is debated (possibly 'haptesthai', 'hapthai' or 'hapte'), but there is a general consensus that it derives from a word, or word group, meaning to fasten, to hold, to grasp, to touch. Haptics covers two types of sensory perception: tactile feedback and force feedback. Haptic displays are concerned with simulating the sense of touch using computers, primarily through force feedback. For example, in the real world, when a hand reaches out to touch a teapot the haptic forces that a user feels are a natural consequence of the laws of physics. When the teapot is virtual, the forces are calculated by a computer and rendered via the haptic display device. Many different types of haptic displays have been produced.
2.1 Haptic Force-feedback devices

Figure 1: Collision-detection routines provide information about contacts S occurring between the device probe at position X and objects in the virtual environment. Force-response routines then return the ideal interaction force Fd between the device probe and virtual objects. Control algorithms send a force Fr to the device, approximating Fd to the best capability of the device.
The PHANTOM range from SensAble Technologies, Inc. (Woburn, USA) is probably the most widely used family of commercial haptic devices. The PHANTOM originates from research at the Massachusetts Institute of Technology (MIT) [Massie and Salisbury 1994]. Other manufacturers include Force Dimension (Lausanne, Switzerland) and Immersion (San Jose, USA). Designs of haptic devices vary from exoskeletal devices that operate an entire arm, to those for a hand, or even a finger tip. While some types are commercially available, many more are being investigated in the research domain. Many of the commercial devices are limited to generating specific types of virtual haptic experience: in general, the cheaper the device, the lower the fidelity and the fewer Degrees of Freedom (DOF) presented. Some devices, like the Immersion CyberGrasp, have up to 22 DOF; others, such as the PHANTOM Omni, have just three degrees of freedom of force output. Other attributes that are desirable in haptic feedback devices are the force they can produce, the ability to move freely with low friction and inertia, and high stiffness of the construction materials [Laycock and Day 2003][Massie and Salisbury 1994][Burdea 1996][Ellis et al. 1996].
2.2 The Novint Falcon

For this work we have specifically utilised the Novint Falcon, which is based on Force Dimension's Delta device. The Falcon represents the first 3D haptics product targeted at the mass consumer market. It is a 3 DOF translational (no orientation) device capable of greater than 1 N of force, released in the US at approximately $200, around 20% of the price of (say) the PHANTOM Omni. The Falcon is essentially pioneering the market for haptics in the home. Novint's primary target is the computer gamer, yet to successfully recoup their investment, exciting haptically enabled high-profile games titles will need to appear on the market. If Novint are successful in disseminating the technology, they will be instrumental in ushering in the haptics revolution: higher sales volumes and mass production will drive innovation and lower unit costs, and in parallel software support will increase, such that haptic devices become an indispensable tool for the Human Computer Interface (HCI), much as the mouse and keyboard are today.

Novint have released their Windows-based HDAL C++ SDK [Novint 2008] to allow developers to create the haptic applications that are essential to this strategy. An open source alternative is the libnifalcon library [LibNiFalcon], which provides similar functionality to Novint's SDK but is also cross platform, with support for Linux. Through reverse engineering, the authors of libnifalcon have also provided a valuable resource on the internal workings of the Falcon [Machulis]. Furthermore, a compatible HDL class allows this driver to be used instead of the proprietary Novint drivers.

2.3 Haptic Rendering

For realistic results, the haptic rendering pipeline (see Figure 1, source: [Salisbury et al. 2004]) must run at a refresh rate of at least 1000 Hz. This means that the algorithms that calculate the forces must 'read' the input from the device, calculate the new forces to be rendered, and send them to the device within 1 ms. The graphics rendering pipeline runs in parallel and requires an update rate of 30 Hz for real time. The simplest approach to haptic rendering is to consider the probe of the haptic device (known as the end effector) as controlling a single point, called the Haptic Interface Point (HIP), and to implement point based interaction within the virtual environment. The HIP is a single point of interaction, akin to probing with just the very tip of a finger. As the HIP is moved around the workspace, the computer simulates a point that moves around the virtual environment. The application reads the position from the device and can use it to update the graphical representation of the HIP, which might be a small sphere centered on the HIP. It is the responsibility of the haptic rendering pipeline to detect whether the HIP is inside an object; if it is, the pipeline must calculate a force vector – the reaction force that a user should feel – and send it back to the haptic display [Massie and Salisbury 1994]. A problem, however, is that no haptic device can produce infinitely hard virtual objects, and so some penetration of the HIP into an object will always be possible. Consider the case where the HIP touches the face of a fairly thin object. A reaction force is applied by the device but the HIP still penetrates the object to some extent. Since no record is kept of the previous position of the point, if it penetrates beyond half-way through the object it will be closer to the far face. The algorithm will then apply a force that pushes the point out through the far face, and effectively the point will have passed through the object.

Zilles and Salisbury [1995] introduced the concept of the god-object, or point-proxy, to address this. When in free space, the HIP and the proxy point are co-located. When the HIP penetrates an object, the proxy remains constrained to the surface of the object. When there is no friction on the surface, as the HIP moves the proxy moves over the surface in a way that keeps the distance between the HIP and the proxy at a minimum. The reaction force for the contact is then always in the direction from the HIP towards the god-object. Numerical instabilities can emerge from very small gaps in a polygonal surface [Ruspini et al. 1997]. Also, because the god-object is a point-proxy, it can sometimes pass through the mesh and into the interior of an object. To avoid the point falling through the mesh it is necessary to reconstruct the topology of the surface [Laycock and Day 2007]. Alternatively, a finite sized sphere can be used for the interaction point instead of a point, as this will no longer fall through any gaps that are smaller than its diameter. However, a larger proxy requires more computation, which can be slower.

Finally, collision detection is fundamental to Computer Haptics – see [Hadap et al. 2004] for a useful survey of the subject. The primary difference between collision detection for haptics and for graphics is that a graphics collision computation has approximately 30 times longer in which to complete, and also has access to dedicated hardware for performing the computations.
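To make the proxy idea concrete, the following minimal sketch (ours, not taken from any of the libraries discussed below; the Surface interface and the stiffness value are illustrative assumptions) shows a frictionless god-object update with a spring force pulling the HIP back towards the proxy:

import javax.vecmath.Vector3d;

/** Minimal frictionless god-object sketch (illustrative only). */
public class GodObjectSketch {

    /** Hypothetical collision interface: moves a point from 'from' towards
        'to', stopping on the first surface crossed. */
    public interface Surface {
        Vector3d constrain(Vector3d from, Vector3d to);
    }

    private final Vector3d proxy = new Vector3d(); // stays on object surfaces
    private final double stiffness = 700.0;        // N/m, device dependent

    /** One 1000 Hz haptic frame: update the proxy, then return the
        reaction force F = k (proxy - HIP) to send to the device. */
    public Vector3d update(Vector3d hip, Surface surface) {
        proxy.set(surface.constrain(proxy, hip));
        Vector3d force = new Vector3d();
        force.sub(proxy, hip);   // points from the HIP towards the proxy
        force.scale(stiffness);
        return force;
    }
}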
3 Methods and Tools

3.1 X3D: eXtensible 3D graphics

X3D is the third generation of the open, web-deployable 3D graphics specification, presented in the International Organization for Standardization (ISO) standard ISO/IEC FDIS 19775-1.2:2008. It is also known as X3D 3.0 and is the successor of VRML 2.0 (ISO/IEC 14772-1:1997). X3D is maintained by the Web3D Consortium, a non-profit consortium formed from a broad cross-section of companies, academic institutions, government agencies and individual professionals. The consortium supports several working groups formed by its members.

3.1.1 X3D Browser

X3D is an abstract specification that describes a file format and specifies the implementation of a scene graph, as well as some internal components of a software browser capable of browsing it. The implementation of the browser is not prescribed and is left to the developers. The specification ensures that events are synchronized with wall-clock time rather than processor-clock time; i.e., objects in the environment will remain synchronised even if parts of the rendering algorithms change speed in the future [Brutzman and Daly 2007].

The X3D scene graph evolves dynamically according to user input and other events. Event nodes, stored in the event tree, are triggered by user input and can be routed into other nodes such as animation interpolators. Browsers may also support ECMAScript or Java, and scripts can interact with both the event model and the scene graph, creating and manipulating nodes to produce a rich interactive experience for the user.

3.1.2 The Scene Authoring Interface and Haptics

The SAI allows external access to the browser, such that a browser supporting the SAI [Web3D c] can be embedded within an existing application. Currently the only implementation of the SAI specification is in the Xj3D browser, a reference X3D implementation produced by the Web3D Consortium. It is a Java based toolkit and delivers a Java package, org.web3d.x3d.sai, that can be used to initialise a browser window from a Java application and to create and manipulate the content of the scene graph. The interfaces used by the SAI are documented in the SAI API [Web3D b] and the X3D specification [Web3D a].

Using the SAI we have access to the X3D graphical environment from Java code. Therefore, the simplest way to implement haptics in an X3D environment is to use a haptics rendering library from within the Java code.
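To illustrate, a minimal sketch of external SAI access (ours; the org.web3d.x3d.sai types are those delivered by Xj3D, but the scene file name is a placeholder and error handling is elided):

import java.util.HashMap;
import org.web3d.x3d.sai.*;

public class SAISketch {
    public static void main(String[] args) {
        // Create an Xj3D browser component and obtain its external browser.
        X3DComponent component = BrowserFactory.createX3DComponent(new HashMap());
        ExternalBrowser browser = component.getBrowser();
        // Load a world; the returned scene lets us create and manipulate nodes.
        X3DScene scene = browser.createX3DFromURL(new String[] { "teapot.x3d" });
        browser.replaceWorld(scene);
    }
}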
3.2 Open source Haptic-Rendering Libraries

The two most popular current open source haptic rendering libraries are described below. These libraries are not available for Java, however; the solution we subsequently developed in this work was to create a Java binding to one of them using the Java Native Interface (JNI).

CHAI (Computer Haptics & Active Interfaces) is a freely downloadable open source set of C++ libraries for computer haptics. It supports most commercially-available 3 DOF and 6 DOF haptic devices, and presents a simple interface for the addition of new force feedback devices. It is a light platform on which extensions can be developed.

H3D is an example of a haptically enabled virtual environment Rapid Application Development (RAD) platform based on X3D.¹ It is produced and maintained by SenseGraphics (Stockholm, Sweden). The H3D API is written in C++ and uses HAPI (an open-source, cross-platform haptics rendering engine written in C++) to create a unified scene graph of both haptics and graphics nodes (unlike OpenHaptics). H3D can be scripted in XML or Python as well as used as a C++ library.

¹Although the H3D API is open source, we do not refer to it as 'freely downloadable' as it is necessary to register with the project before the source code is made available.

Although both Chai3D and HAPI could have been chosen for this work, the latter was selected for two principal reasons: 1. HAPI is used by H3D for its haptics rendering, and H3D is based on the X3D standard. 2. HAPI provides a modular haptics rendering framework and allows a choice of different rendering engines, namely OpenHaptics (licence required), the Chai3D rendering engine, an open source implementation of the god-object renderer [Zilles and Salisbury 1995], and an open source implementation of the Ruspini renderer [Ruspini et al. 1997].

3.2.1 Device handling in HAPI

HAPI is modular and implements device handling in an abstract layer, making device specific implementations completely separate from the rest of the library's functionality. HAPI provides a device independent class called HAPI::AnyHapticsDevice; this is the only device class we have implemented in our Java wrapping, JHAPI (see Section 4). When using AnyHapticsDevice, HAPI will use the first supported haptics device it encounters. Provided there is only one device connected this is the simplest approach, and it means that the code from our demonstration in Section 5 will work if any of the supported haptic devices is connected. Another feature of this modular device layer is that it is straightforward to extend HAPI to support other drivers such as libnifalcon – see Section 2.2.
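To give a flavour of this layer, a minimal sketch of opening whatever device is attached (ours, using the JHAPI naming introduced in Section 4; initDevice and enableDevice mirror HAPI's C++ device interface, and error handling is elided):

import JHAPI.JAnyHapticsDevice;

public class DeviceSketch {
    public static void main(String[] args) {
        // Wraps HAPI::AnyHapticsDevice: the first supported device found is used.
        JAnyHapticsDevice jhd = new JAnyHapticsDevice();
        jhd.initDevice();    // mirrors HAPI's initDevice()
        jhd.enableDevice();  // starts haptic rendering for this device
    }
}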
3.2.2 HAPI thread handling

HAPI internally manages threads to guarantee haptic rendering at 1000 Hz, and provides a mechanism for inter-thread communication. Figure 2 shows the flow of execution in the main HAPI haptics thread (source: HAPI manual). HAPI may use threads provided by the device drivers of the specific device being used, but this is hidden from the general user. HAPI also provides functions that allow users to implement complicated threading arrangements for specific needs; here, however, we use only the primary means of accessing the main haptic thread and do not create any custom threads of our own. HAPI has a number of other features, such as support for user definable surfaces and force effects, and also layering, which allows multiple proxies to be used, each responding to a different layer – this can be used as a way to simulate a harder material within a softer one (such as bone beneath flesh).

Figure 2: The flow of execution in the main HAPI haptics thread.
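This inter-thread communication surfaces in JHAPI just as it does in HAPI: changes made from the application thread are staged and only handed over to the 1000 Hz haptics thread on request. A minimal sketch (ours; jhd and gravity follow the naming of the listings in Section 5, and teapotShape is a placeholder):

// Application (graphics) thread: stage changes, then hand them over to the
// haptics thread in one synchronised transfer.
jhd.addShape(teapotShape);  // staged; not yet rendered haptically
jhd.addEffect(gravity);     // staged force effect
jhd.transferObjects();      // picked up by the 1000 Hz haptics thread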
3.3 The Java Native Interface

The JNI is a framework in Java that provides an interface between code running on the Java Virtual Machine (JVM) and native code, primarily code compiled from C/C++ – see Figure 3. The interface allows the invocation of native methods from within Java objects, and similarly Java method calls can be made from within the native environment. However, whilst Java objects and their primitive data can be accessed from within native code, the reverse is not true: no mechanism is provided for directly accessing data belonging to the native environment. The JNI is a powerful tool, as it removes all limitations from the Java programmer. However, many elegant aspects of using Java are voided once native code is used, perhaps the foremost being that the application is no longer platform independent. Furthermore, many of the features that Java programmers take for granted, such as type-safe code, automatic garbage collection and array-bound checking, are not preserved by native code. Many of the difficulties in using the JNI stem from the fact that it only allows the execution of individual functions; although it is possible to call C++ methods, this can be done only through a C-style procedural framework, and it is not possible to directly create or access a C++ object through the JNI.

Figure 3: The JNI architecture.
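For readers unfamiliar with the JNI, a minimal complete example of the Java half of a binding (ours; the library name 'jhapi' is an assumption):

public class JniDemo {
    static {
        // Loads libjhapi.so (or jhapi.dll) from java.library.path.
        System.loadLibrary("jhapi");
    }

    // Declared here, implemented in the native library as
    // Java_JniDemo_nativeAdd (see Section 4.1 for the native side).
    private native int nativeAdd(int a, int b);

    public static void main(String[] args) {
        System.out.println(new JniDemo().nativeAdd(2, 3)); // prints 5
    }
}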
4 JHAPI: Java binding of the C++ HAPI haptics library

We call our wrapping of the HAPI library JHAPI (it means 'hugs' in Hindi). We provide JHAPI as a Java package. The classes from the C++ library are implemented as Java classes in this package with a J appended to their name; thus the class HAPI::AnyHapticsDevice in HAPI is implemented in JHAPI as JHAPI.JAnyHapticsDevice. Figure 4 shows a UML diagram of a generic class in the JHAPI package and the functions in the C++ library that are used as an interface to the C++ object in the HAPI library. The key to implementing C++ objects as Java objects is storing the reference to the native object in the JVM as a primitive (for 32 bit systems it is sufficient to use an int for storing the pointer; however, for compatibility with 64 bit systems we use a primitive of type long). Every JHAPI class has a private long nativePtr field and a method createNative(). The constructor of the JHAPI class calls createNative(), passing any constructor arguments to the native library implementation of the method (i.e., the function named Java_JHAPI_JHAPIClass_createNative()). This creates a HAPI::HAPIClass object, casts the pointer to the object to a jlong, and stores this jlong in the Java object's nativePtr field.

Note that nativePtr can be declared final in Java code and the C++ code will still be able to alter its value. This is a useful way of protecting the pointer from being tampered with on the Java side.

Figure 4: UML diagram showing a generic JHAPI Java class and its related functions in the compiled library.
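In Java, the generic pattern of Figure 4 looks as follows (a sketch; JHAPIClass stands for any wrapped HAPI class, as in the text above):

package JHAPI;

public class JHAPIClass {
    static { System.loadLibrary("jhapi"); } // assumed native library name

    // Address of the underlying HAPI::HAPIClass instance. It is written by
    // the native createNative even though it is declared final (see above).
    private final long nativePtr = 0;

    public JHAPIClass(int exampleIntArg) {
        createNative(exampleIntArg); // allocates the C++ object, fills nativePtr
    }

    public long getNativePtr() { return nativePtr; }

    // Implemented in the C++ library as Java_JHAPI_JHAPIClass_createNative.
    private native void createNative(int exampleIntArg);
}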
4.1 Accessing the JVM from native code

In addition to any arguments passed to a JNI native function from the Java code, on the native side the function receives two further parameters: a JNIEnv* and a jobject. The JNIEnv*, usually named 'env', is a pointer to the Java environment presented by the JNI, and is fundamental to all interaction with the JVM from native code. The jobject is a reference to the Java object in the JVM on which the native method was called.
4.2 Accessing native Objects

Here we describe the method we have used to simulate Java classes based on native classes. To store the address referenced by the pointer to the new C++ object, the createNative method looks like Code Listing 1.
JNIEXPORT void JNICALL Java_JHAPI_JHAPIClass_createNative
        (JNIEnv *env, jobject obj, jint exampleIntArg) {
    HAPI::HAPIClass *newHAPIObject = new HAPI::HAPIClass(exampleIntArg);
    jclass cls = env->GetObjectClass(obj);
    jfieldID fid = env->GetFieldID(cls, "nativePtr", "J");
    env->SetLongField(obj, fid, (jlong)newHAPIObject);
}

Code Listing: 1
First we must get a reference to the class of the Java object, and then a reference to its nativePtr field. The GetFieldID method takes three arguments: the reference to the class we just obtained, the name of the field as a string, and the field descriptor, which is also a string. Since nativePtr is a long we use the descriptor for a Java long, which is "J". Once we have a reference to the field we use it to store the address of the pointer we created, cast to a jlong.

Once this method exits, all native references to the object are lost; all we have is the area of memory that was allocated to store the data for the object we created. When we want to call one of the methods on our Java wrapping of the object, we call a procedural function that first recovers the address of the object from the Java environment and then uses it to obtain a pointer of the right type. Once we have the pointer to the object we call the method on the object and pass it the required arguments. When an argument is also a JHAPI object, we obtain a pointer to the argument's object in the same manner.

4.3 Example C++ JNI interface

The JHapticTriangleSet class is used to create a haptic shape from triangle data, and we use this interface to illustrate many of the features of using the JNI. The Java object is mostly just a wrapper: a skeleton class that is fleshed out by methods for manipulating the native library. For example, the constructor shown in Code Listing 2 calls the createNative method, which in the C++ code will instantiate a C++ object in memory before returning the address to be stored in the Java nativePtr long primitive.

public class JHapticTriangleSet extends JHAPIHapticShape { //[1]

    private Vector triangles; //[2]
    private JHAPISurfaceObject surface;
    private ConvexType convex; //[3]
    private JCollision.FaceType touchable_face; //[4]

    public static enum ConvexType { //[5]
        CONVEX_FRONT, CONVEX_BACK, NOT_CONVEX
    }

    public JHapticTriangleSet(Vector triangles, JHAPISurfaceObject surface,
            ConvexType convex, JCollision.FaceType touchable_face) {
        this.triangles = triangles;
        this.surface = surface;
        this.convex = convex;
        this.touchable_face = touchable_face;

        // pass triangles to JNI as an array of pointers
        long[] nativePtrList; //[6]
        nativePtrList = new long[triangles.size()];
        for (int i = 0; i < triangles.size(); i++) { //[7]
            nativePtrList[i] = ((JCollision.JTriangle) triangles.get(i)).getNativePtr();
        }
        createNative(nativePtrList, nativePtrList.length, surface.getNativePtr(),
                convex.ordinal(), touchable_face.ordinal()); //[8]
    }

    private final long nativePtr = 0;

    public long getNativePtr() {
        return nativePtr;
    }

    private native void createNative(long[] nativePtrList, int lenNativePtrList,
            long surfacePtr, int convexOrd, int touchable_faceOrd); //[9]
}

Code Listing: 2

To implement JHAPI, we echoed the same semantic structure in our JHAPI classes as in the original HAPI classes, although the absence of typedefs in Java is one of the features that makes an exact parallel structure impossible. The following comments explain the numbered markers in Code Listing 2:

1. Whenever polymorphism is used we recreate the inheritance hierarchy in our Java objects. Here we inherit from JHAPIHapticShape so that we can pass JHapticTriangleSets to methods such as JAnyHapticsDevice.addShape().

2. Since the C++ class uses a C++ vector to store the triangle data for the haptic shape, the Java Vector class is a good choice to implement this in JHAPI. However, this means we have to copy the data in the Java Vector to the C++ vector (see point 7).

3. The C++ code uses C++ enums to define constants such as the ConvexType and FaceType. Since Java now has support for enums we decided to utilise these in JHAPI.

4. The FaceType enum is part of the Collision namespace in HAPI and so we implemented it as a static enum in the JCollision class.

5. The declaration of the ConvexType enum. The ConvexType enum is defined in the HapticTriangleSet class in HAPI, so we implemented it analogously in JHAPI.

6. To simplify the passing of triangle data to the native code, an array storing the address of every triangle object in the vector is created.

7. Loop over all the triangles in the vector, adding their addresses to the array.

8. The native createNative method is called with all the required arguments.

9. The declaration of the createNative method. Note how the enums are passed as ints.

The Java Vector is converted into an array of triangle addresses and passed to the native JNI code, which will then create a C++ vector of HAPI::Collision::Triangle objects. Obviously there is a certain amount of inefficiency here, but fortunately the constructor is not likely to be called frequently at runtime.
4.4 Example C++ JNI implementation

Code Listing 3 shows the core of the native implementation of createNative for JHapticTriangleSet (abridged): the triangle addresses passed from Java are copied into a buffer, used to populate a C++ vector of triangles, and the address of the resulting HAPI::HapticTriangleSet is stored in the Java object's nativePtr field, exactly as in Code Listing 1.

JNIEXPORT void JNICALL Java_JHAPI_JHapticTriangleSet_createNative
        (JNIEnv *env, jobject obj, jlongArray trianglePtrArray, jint lenArray,
         jlong surfacePtr, jint convexOrd, jint touchable_faceOrd) {
    // copy the triangle addresses out of the Java long[]
    jlong *ptrArrayBuffer = new jlong[lenArray];
    env->GetLongArrayRegion(trianglePtrArray, 0, lenArray, ptrArrayBuffer); //[5]
    // loop over array copying triangles to vector
    std::vector<HAPI::Collision::Triangle> triangles;
    for (int i = 0; i < lenArray; i++) {
        triangles.push_back(*(HAPI::Collision::Triangle *) ptrArrayBuffer[i]);
    }
    // construct the native object (conversion of convexOrd and
    // touchable_faceOrd to the HAPI enums is omitted here)
    HAPI::HapticTriangleSet *htset = new HAPI::HapticTriangleSet(
        triangles, (HAPI::HAPISurfaceObject *) surfacePtr);
    // store the new object's address in the Java nativePtr field
    jclass cls = env->GetObjectClass(obj);
    jfieldID fid = env->GetFieldID(cls, "nativePtr", "J");
    env->SetLongField(obj, fid, (jlong)htset);
}

Code Listing: 3

5 Demonstration

A function to call each frame: readableFieldChanged(X3DFieldEvent e)

The haptic thread is now set up and runs at 1000 Hz as required. Next we need to be able to recover the position of the proxy, so that the graphical representation of the HIP is drawn synchronised with the haptic feedback. A god-object renderer was selected, using a point as the proxy; in practice, however, a small sphere provides a clearer visual cue. The position of the proxy must be obtained every frame and used to update the scene graph. The best way to achieve this using the SAI was not immediately obvious. In the end, the best method found was to create a TimeSensor node, whose fraction_changed field is updated once per frame; see Code Listing 4. The TeaPotDemo class implements the X3DFieldEventListener interface, and the primary class (the this object) is set to be the listener for this event (see line 5 of Code Listing 4). This ensures that the readableFieldChanged(X3DFieldEvent e) method will be called once per frame; the graphical proxy position can then be updated, as shown in Code Listing 5.
1 private void initFrameTimer() {
2     frameTimer = (TimeSensor) mainScene.createNode("TimeSensor");
3     ((SFBool) frameTimer.getField("loop")).setValue(Boolean.TRUE);
4     SFFloat frameChanged = (SFFloat) frameTimer.getField("fraction_changed");
5     frameChanged.addX3DEventListener(this);
6     mainScene.addRootNode(frameTimer);
7 }

Code Listing: 4
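For completeness, a minimal sketch of the listener method that Listing 4 wires up (ours; it performs the per-frame updates of Code Listings 5 and 6, and updateLidInteraction is a hypothetical helper sketched further below):

public void readableFieldChanged(X3DFieldEvent e) {
    // Called once per frame via the TimeSensor's fraction_changed field.
    pos = renderer.getProxyPosition();            // Code Listing 5
    proxySphereTransform.setTranslation(new float[] {
        (float) pos.x * magnification,
        (float) pos.y * magnification,
        (float) pos.z * magnification });
    updateLidInteraction();                       // hypothetical helper, see below
}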
pos = renderer.getProxyPosition();
proxySphereTransform.setTranslation(new float[]{(float) pos.x * magnification,
    (float) pos.y * magnification, (float) pos.z * magnification});

Code Listing: 5

Code Listing 6 gives us the status of the buttons on the haptic device as an integer (note that there is an issue with this function in the 1.0 release of the code; for discussion see [Kurmos 2008]).

int stat = jhd.getButtonStatus();

Code Listing: 6

The logic implemented ensures that when the lid is not already being lifted and the button is pressed, and the proxy is within a certain Euclidean range of the tip of the teapot's lid, the gravity force is added and a boolean is set to indicate that the lid is lifted. The gravity force is a class variable whose definition, shown in Code Listing 7, creates a force in the negative Y-direction of strength 50; it is attached to the device as shown in Code Listing 8.

gravity = new JHapticForceField(new Vec3(0, -50, 0));

Code Listing: 7

jhd.addEffect(gravity);
jhd.transferObjects();

Code Listing: 8

Once the button is released, gravity causes the lid to fall vertically down. The demonstration works well, with both haptics and graphics running flawlessly, and there is no sign of any runtime performance degradation resulting from using the JNI. Additional implementation details, and a description of related issues with non-static objects, can be found in [Kurmos 2008].
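Pulling these pieces together, a sketch of the lid-lifting logic described above (our reconstruction; the names lidLifted, lidTip and liftRange are hypothetical, while jhd, gravity and pos follow Code Listings 5 to 8):

// Hypothetical helper called once per frame from readableFieldChanged.
private void updateLidInteraction() {
    int stat = jhd.getButtonStatus();   // Code Listing 6
    boolean pressed = stat != 0;        // assumption: non-zero while pressed
    double dx = pos.x - lidTip.x;
    double dy = pos.y - lidTip.y;
    double dz = pos.z - lidTip.z;
    if (!lidLifted && pressed && Math.sqrt(dx * dx + dy * dy + dz * dz) < liftRange) {
        jhd.addEffect(gravity);         // Code Listings 7 and 8
        jhd.transferObjects();
        lidLifted = true;
    }
}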
6 Conclusions and Future Work

The proof of concept demonstration performs well and is indistinguishable from a purely native environment. Thus we have demonstrated that it is possible to perform haptic rendering from Java by implementing Java bindings to a C++ haptic library. This has been possible because the JNI code need only communicate with the native haptics thread once per graphics frame to read the position of the proxy, and once to apply the gravity force effect; neither of these calls involves much computation. Similarly, JNI has also been successful in wrapping OpenGL with the JOGL library, and its performance for interactive computer graphics has been validated.

Another question we set out to explore was whether Java-wrapped HAPI is a good choice of rendering engine for providing native haptic support from within an X3D browser through the external SAI. We believe that this is partly proved, as we have demonstrated that Java-wrapped HAPI works well for touch interaction with static objects. However, issues of stability arose once we attempted to go beyond initialising static shapes, which appear to stem from bugs in the HAPI library. It is likely that at least some of these issues will be fixed by using the latest development version of the code. The lack of documentation for HAPI has also been a hindrance.

Our use of the Novint Falcon has been successful, although JHAPI is designed to function with any haptics device that is supported by HAPI. The low price of the Falcon and its potential as a mass market device make it particularly promising, especially for investigating haptic Web3D functionality. However, an open source solution that is fully compatible with X3D would certainly stimulate more sales and wider usage than is currently likely with Novint's own marketing plans. Their strategy is focused on gaining the attention of major games publishers and buying the 'touch' rights to a title [Nutall 2008] so that they can implement the haptics support themselves. Whilst this approach has been very successful in getting simple haptic feedback into existing games, alone it will not persuade the games manufacturers to develop haptics-oriented games, which is where we believe the real potential of the Falcon lies.

Future work will investigate creating a wrapper for CHAI3D and comparing its results, performance and stability with JHAPI. The H3D API and the ReachIn API have also extended the X3D (VRML in the case of ReachIn) scene graph to include haptic nodes. It should be straightforward to take a similar approach and extend the ISO X3D standard to include a specification of haptics. There is currently interest within the Web3D Consortium in establishing a Haptics Working Group, which will be the catalyst for achieving this goal. We believe that the work described in this paper will make a valuable contribution to this initiative.
References

Barbagli, F. 2005. CHAI3D: an open-source toolkit for haptic rendering and applications. Presented as part of the course 'Recent Advances in Haptic Rendering & Applications' at SIGGRAPH 2005, July.

Brutzman, D., and Daly, L. 2007. X3D: Extensible 3D Graphics for Web Authors (The Morgan Kaufmann Series in Interactive 3D Technology). Morgan Kaufmann, April.

Burdea, G. C. 1996. Force and Touch Feedback for Virtual Reality. John Wiley & Sons, Inc., New York.

Ellis, R., Ismaeil, O., and Lipsett, M. 1996. Design and evaluation of a high-performance haptic interface. Robotica 14, 321–327.

Güler, N. F., and Übeyli, E. D. 2002. Theory and applications of telemedicine. J. Med. Syst. 26, 3, 199–220.

Hadap, S., Eberle, D., Volino, P., Lin, M. C., Redon, S., and Ericson, C. 2004. Collision detection and proximity queries. In SIGGRAPH '04: Course Notes, ACM, New York, NY, USA, 15.

Kurmos, L. 2008. Integrating Haptics in Virtual Environments. Master's thesis, Bangor University, School of Computer Science.

Laycock, S. D., and Day, A. M. 2003. Recent developments and applications of haptic devices. Computer Graphics Forum 22, 2 (June), 117–132.

Laycock, S. D., and Day, A. M. 2007. A survey of haptic rendering techniques. Computer Graphics Forum 26, 1 (March), 50–65.

LibNiFalcon. libnifalcon page on SourceForge. http://libnifalcon.wiki.sourceforge.net/, confirmed valid on 27/11/08.

Machulis, K. Technology blog. http://www.nonpolynomial.com/s, confirmed valid on 27/11/08.

Massie, T. H., and Salisbury, J. K. 1994. The PHANTOM haptic interface: a device for probing virtual objects. In Proc. 1994 International Mechanical Engineering Congress and Exposition, ASME, Massachusetts Inst of Technology, Cambridge, United States, vol. 55-1, 295–299.

Menachemi, N., Burke, D. E., and Ayers, D. J. 2004. Factors affecting the adoption of telemedicine—a multiple adopter perspective. J. Med. Syst. 28, 6, 617–632.

Morris, D., Tan, H., Barbagli, F., Chang, T., and Salisbury, K. 2007. Haptic feedback enhances force skill learning. In WHC '07: Proceedings of the Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, IEEE Computer Society, Washington, DC, USA, 21–26.

Novint. 2008. HDAL Programmers Guide. Novint Technologies Incorporated, Albuquerque, NM, USA.

Nutall, C. 2008. Physical moves in virtual reality. ft.com story on the Falcon. http://www.ft.com/cms/s/0/4c7eaa0c-8989-11dd-8371-0000779fd18c.html?nclick_check=1, confirmed valid 28/11/08.

Ruspini, D. C., Kolarov, K., and Khatib, O. 1997. The haptic display of complex graphical environments. In SIGGRAPH '97: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press/Addison-Wesley Publishing Co., New York, NY, USA, 345–352.

Salisbury, K., Conti, F., and Barbagli, F. 2004. Haptic rendering: introductory concepts. IEEE Computer Graphics and Applications 24, 2, 24–32.

Sensegraphics. H3D API. http://www.h3d.org.

Vidal, F. P., John, N. W., Healey, A. E., and Gould, D. A. 2008. Simulation of ultrasound guided needle puncture using patient specific data with 3D textures and volume haptics. Comput. Animat. Virtual Worlds 19, 2, 111–127.

Web3D, a. The X3D specification, ISO/IEC FDIS 19775-1.2:2008. http://www.web3d.org/x3d/specifications/ISO-IEC-FDIS-19775-1.2-X3D-AbstractSpecification/index.html, confirmed valid on 28/11/08.

Web3D, b. The Xj3D SAI API web documentation. http://www.xj3d.org/javadoc2/org/web3d/x3d/sai/package-summary.html, confirmed valid 28/11/08.

Web3D, c. SAI specification, ISO/IEC CD 19775-2 Ed. 2:200x. http://www.web3d.org/x3d/specifications/ISO-IEC-CD-19775-2.2-X3D-SceneAccessInterface/index.html.

Zilles, C. B., and Salisbury, J. K. 1995. A constraint-based god-object method for haptic display. In Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems ('Human Robot Interaction and Cooperative Robots'), vol. 3, 146–151.