Stochastic models for haptic texture

Jason P. Fritz and Kenneth E. Barner
Applied Science and Engineering Laboratories (ASEL)
University of Delaware / Alfred I. duPont Institute, Wilmington, DE
email: [email protected], [email protected]

ABSTRACT

Recent research in haptic systems has begun to focus on the generation of textures to enhance haptic simulations. Synthetic texture generation can be achieved through the use of stochastic modeling techniques to produce random and pseudo-random texture patterns. These models are based on techniques used in computer graphics texture generation and textured image analysis and modeling. The goal for this project is to synthesize haptic textures that are perceptually distinct. Two new rendering methods for haptic texturing are presented for implementation of stochastic-based texture models using a 3 DOF point interaction haptic interface. The synthesized textures can be used in a myriad of applications, including haptic data visualization for blind individuals and overall enhancement of haptic simulations.

Keywords: haptic, synthetic texture, stochastic models, visualization

1. INTRODUCTION

Haptic rendering involves the computation of forces to be generated by a force-reflecting interface to convey tactual representations of a virtual or remote environment. For point interaction haptic interfaces, such as the PHANToM™, a single force vector is computed at the position of the input based on the interaction with the rendered environment. Solid objects are typically rendered by determining the point of surface penetration, obtaining the surface normal vector, and scaling according to a given force profile. This force vector is referred to as the constraint force. Surface properties, such as friction and texture, can then be added through a vector function of this constraint force and the surface properties being modeled. Psychophysical studies have shown that surface properties are important for haptic exploration of objects. Therefore, friction and texture are used to enhance a haptic simulation by making it more realistic and by conveying additional information. Simulating these properties with a haptic interface, however, is a relatively new area of research. Fortunately, haptic texturing can draw from the experience of computer graphics texturing and psychophysical experiments on the human haptic system. Graphics provides insight into the underlying procedures of computer generated textures, while research on the human haptic system provides information about human perception of textures, such as bandwidth measures and resolution. In terms of hardware implementation, there are two major constraints for developing haptic simulations: the force signal sent to the haptic device must have low power at high frequencies for hardware stability, and all computations must be done in real time at a fast update rate (about 1 kHz). Computer generated texture research began with the desire for realism in computer graphics.
(In Proceedings of the SPIE International Symposium on Intelligent Systems and Advanced Manufacturing - Telemanipulator and Telepresence Technologies III Conference, Boston, MA, November 1996.)

To provide a means for psychophysical experiments and added realism in computational haptics, synthetic textures displayed through a haptic interface were developed. One of the first systems was developed by Minsky, et al. and was called the Sandpaper System. Haptic texturing research then moved in the same direction as graphics texturing through the use of random processes as a primitive. Our work builds on this history to produce a wider variety of stochastic textures, and to utilize them in a visualization application. Synthetic texturing can be used in a haptic data visualization


Figure 1: Displacement mapping (top) and bump mapping (bottom).

Figure 2: Interaction force components.

simulation to extract information, analogous to the use of color in graphics visualization. For example, textures which can be adjusted with a few parameters can be modulated by the underlying data. The goal is then to produce distinguishable, information bearing textures, not specifically to model real textures. The remainder of this paper is organized as follows. Section 2 discusses the general implementation methods used for haptic texturing, including related work in computer graphics and previous haptic methods. Two new techniques for haptic texturing are presented here. The following section covers the details of the stochastic models used for our texture generation schemes. Different filtering techniques are also presented to produce variations. Section 4 discusses the results of implementations and applications, followed by a brief synopsis of our conclusions and future directions.

2. HAPTIC TEXTURING METHODS

Creating computerized textures involves several steps. First, a coordinate space, which contains the desired texture, is defined. This texture space is then combined with the object space, through sampling and coordinate transforms if necessary, into world space. There are a number of techniques to obtain the appropriate texture samples, and to process them for display to the user, as this section describes. The following section covers the details of generating the texture space from stochastic and/or deterministic models.

2.1. Graphics texturing

The quest for realism in computer graphics has resulted in very realistic appearing objects through the use of shading and texturing. Any image can be considered a texture, but for our purposes, a texture is a deterministic or stochastic pattern on an object that cannot be displayed purely by shape. Graphical texture is created by calculating the light intensity on a surface as the result of light reflection models. The first computer generated textures consisted of mapping 2D images onto a 3D surface. In this method, each 3D surface point has a corresponding 2D image point. Problems arise, however, when the 2D patch is warped onto curved surfaces such as spheres. Moreover, this method only changes the reflection intensity without disturbing the underlying smoothness of the surface. Other reflection models depend on the surface normal, so wrinkled and bumpy surfaces can be produced by perturbing the normal vector. This technique is known as bump mapping, Fig. 1, and requires specific texture pattern data even for random textures. A similar technique known as displacement mapping modifies the actual surface, which also changes the normals and allows the bumps to be visible at silhouette edges. All of these techniques, however, require storage of pattern data and often possess discontinuity problems at the boundaries of texture patches. In the course of computerized texture evolution, a variety of methods have been developed using synthetic models for texturing, called procedural techniques. Many of these models create different textures through the control of a few parameters, and require little storage space, making them very versatile. Among these techniques are Fourier spectral synthesis and solid texture models. Spectral synthesis generates textures by modeling the power spectrum of a desired texture, and is very effective at the expense of the computation time required for the inverse FFT calculations.
Solid textures are rendered on a uniform 3D lattice. The lattice provides independence of object shape, and is unaffected by distortions of surface parameter space (e.g., the poles of a sphere). Solid textures are also good for incorporating spatial dependencies and texture continuity. For example, an object carved from solid wood exhibits grain textures depending on orientation with respect to the growth of the tree. A solid texture can account for this, whereas a mapped texture will typically not have realistic relationships between adjacent patches, destroying


the illusion of continuous texture. Upon display to a 2D screen, only the visible surface and corresponding texture are rendered for a particular camera view or animation frame. This technique avoids spatial coherence problems prevalent in 2D mapping methods, where the boundaries of the patches become visible. A derivative of this method is used to create "hypertextures" such as fire and water, where the texture has more influence on the object shape.

2.2. Previous haptic texturing

Generating haptic textures has many similarities with graphic texturing, but there are also several key differences. First, haptics is a local sense, unlike vision, and only requires knowledge of the tactile features in or around the area of contact. Haptic textures can be formulated as modifications of the 2D image, bump, and displacement mapping, with a major exception being that there is no light model. Rather than determining a scalar value of intensity from a light source, an actual force vector is generated dependent only on the object itself. This can be accomplished by modifying the actual surface with a height map or by perturbing the normal vector. With the perturbation model, the texture force, Ft, and the friction force, Ff, are added to the normal surface constraint force, Fc, to form the resultant force, Frslt = Fc + Ft + Ff, as displayed in Fig. 2. Fundamentally, though, haptic texturing is similar to its graphics counterpart, since both display a sample of a stochastic and/or deterministic function that can be mapped, or computed in a procedure, and will change the surface features. Recently, a simple experiment was conducted to determine which method, surface displacement or normal perturbation, contributes more to the perceived object shape, and to see if perturbing the normal vector appropriately could negate the effect of surface displacement. Interpretation of the results concluded that displacement mapping was twice as effective, and that bump mapping could not negate the displacement mapping. However, the perturbation vectors were always of unit length, making no attempt to simulate, or compensate for, surface distortion by changing the magnitude of the normals. Changes in the magnitude of the normal vector will modify the roughness of a haptic texture, while only the angle affects graphics textures, which use normalized vectors.
This experiment did show, however, that normal vector perturbation is capable of conveying surface distortion without actually modifying the surface. A novel method of texture rendering with a 2 DOF haptic interface was developed by Minsky, et al. Using a height map of values on a rectangular lattice, lateral forces were determined from the gradient of the mapped surface. The forces opposing user motion were calculated as

F_tx = k (∂h/∂x),   (1)

where x is the translation direction, h is the surface height (making ∂h/∂x the local gradient), and k is a constant of proportionality. A similar scaled gradient is calculated for the y direction. This lateral force gradient algorithm produces a realistic simulation of bumps and indentations, and is considered a surface displacement technique. For generating texture grids and gratings, a procedural approach was taken (vs. mapping), since the periodic structure is easy to model with bandlimited, deterministic signals. A stochastic texturing technique to synthesize surface roughness for sliding objects was developed by Siira and Pai. Using a 2 DOF haptic interface, they obtained the current height of an implicit surface by sampling at points v·n·dt, where v is the velocity, n is a positive integer, and dt is the sampling time period. At each point, a random number is generated to represent the surface asperity, and then added to the sampled height. This surface asperity value is independent of the velocity, resulting in a constant distribution regardless of the speed of the user. To ensure that no texture force is applied when the user is not moving, Ft is set to zero below a small velocity threshold. For a more realistic feel, the texture force magnitude, ‖Ft‖, is made proportional to the constraint force magnitude, so the harder one pushes into the surface, the stronger the texture force displayed.
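As a concrete illustration, the lateral force gradient of Eq. (1) can be sketched in a few lines. This is a hypothetical implementation, not the original Sandpaper code: it assumes a height map on a unit-spaced lattice and estimates the local gradient with central differences; all names are ours.

```python
def lateral_texture_force(height, x, y, k=1.0):
    """Sketch of the lateral force gradient algorithm of Eq. (1).

    `height` is a 2D height map (list of lists) on a unit-spaced
    lattice; the lateral texture force is k times the local gradient,
    estimated here with central differences.  Illustrative names and
    discretization, not the paper's exact implementation.
    """
    dh_dx = (height[y][x + 1] - height[y][x - 1]) / 2.0
    dh_dy = (height[y + 1][x] - height[y - 1][x]) / 2.0
    return (k * dh_dx, k * dh_dy)
```

A height bump then produces forces along its slope, which the user feels as resistance when moving uphill.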


2.3. Implementation methods

The haptic texturing in our investigations is developed upon the vector perturbation approach. We also use a velocity threshold and enforce the proportionality between the magnitude of the texture force and normal constraint force where applicable. The two implementation techniques presented here are the lattice texture and the local space texture methods. For additional flexibility, all of the methods discussed are designed to be procedural in order to allow the textures to be functions of a data set, and to be tuned for desired results.

[Plot: Gaussian vector field]
Figure 3: 3D texture vectors on a 2D lattice.

Figure 4: 3D texture sampling lattice.

2.3.1. Lattice texture

Generating the texture samples on a uniformly spaced integer lattice offers several advantages. A 2D lattice, Fig. 3, provides a means for spatial filtering and for introducing neighborhood dependencies among the samples on the lattice. A texture sample from a wood plank, for example, will exhibit certain dependencies on nearby samples in order to display the wood grain. At points within the lattice cells, the texture vector is a function of the sampled vectors on the neighboring lattice points (e.g., through interpolation). However, the 2D texture must still be mapped onto a 3D surface, which can cause the same warping problems seen in graphical mapping. A solid texture, akin to Perlin's method, can be generated on a 3D lattice, Fig. 4, making the texture independent of object shape, where ‖Ft‖ = k‖Fc‖. Flow textures can also be simulated on a 3D lattice (similar to "hypertexture") as a 3D vector force field. Examples of flow textures include moving water and thermal radiation, where temperature gradients are transformed to force gradients. Haptic rendering of a lattice texture requires knowledge of the occupied lattice cell, which is determined from the haptic rendering scheme used, and is related to the location of the haptic interface position.
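The interpolation step mentioned above can be sketched for the 2D case: the texture vector at an interior point is a bilinear blend of the four corner vectors of the occupied cell. This is a minimal illustrative sketch, assuming unit lattice spacing and pre-generated corner vectors; it is not the paper's exact scheme.

```python
def bilinear_texture(lattice, px, py):
    """Interpolate a texture vector at a point inside a lattice cell.

    `lattice[j][i]` holds a pre-generated 3D texture vector at integer
    lattice point (i, j); the vector at (px, py) is the bilinear blend
    of the four corner vectors of the cell containing the point.
    Hypothetical sketch with unit lattice spacing assumed.
    """
    i, j = int(px), int(py)
    fx, fy = px - i, py - j          # fractional position inside cell

    def lerp(a, b, t):
        return tuple(av + t * (bv - av) for av, bv in zip(a, b))

    bottom = lerp(lattice[j][i], lattice[j][i + 1], fx)
    top = lerp(lattice[j + 1][i], lattice[j + 1][i + 1], fx)
    return lerp(bottom, top, fy)
```

The same idea extends to a 3D lattice with trilinear interpolation over the eight corners of the occupied cell.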


2.3.2. Local space texture

The local space technique takes advantage of the local nature of haptics. In many cases, it may not be necessary to consider global relationships among texture samples. A lattice is still defined in the texture space coordinate system; however, only the texture samples in a local region are known. For example, consider a window in the previously discussed 2D lattice. In a given cell, the only known texture samples are on the lattice points of that cell and, if necessary, points of the adjacent cells. For a different configuration, consider each cell containing one sample defined at the center of the cell. In this case, the neighboring samples must be acquired for filtering in order to assure continuity of the texture force vector across cell boundaries. Extending this local lattice technique to a 3D lattice is straightforward, and is accomplished by considering the cell layers above and below the cell occupied by the interface point. At the expense of using more samples, the 3D lattice texture becomes spatially coherent and independent of object shape. It has been demonstrated, though, that realistic textures can be generated which require only the samples of the occupied cell. As described above for the global 3D lattice methods, this also applies to haptic texturing. Techniques for assuring that the same texture vectors will be sampled given the same surface coordinates (repeatability), and implementation and filtering issues, are discussed later. A variation of this local lattice method is accomplished by defining only a sampling distance that is independent of the sample path. This method is similar to the temporal sampling method except that the sampling interval is defined by a spatial distance, r. For 2D sampling, r defines the radius of a circle on the plane tangential to the surface at point p, as Fig. 5 shows. Once the user touches the surface, that point, p, is stored and a texture vector is generated. To determine when the next texture sample should be obtained, the user controlled point is projected onto this tangential plane. While this point is within the circle defined by the center coordinates p and radius r, the texture force is a function of the same sample. At the first clock cycle after the projected point breaches this circle, that point becomes the new p, and a new texture sample is obtained. The process then continues, as Fig. 6 displays.
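The 3D (sphere) version of this sampling loop can be sketched as a small state machine: a new sample is drawn only when the interface point leaves the sphere of radius r around the last sample center. The class name, the Gaussian sample source, and the omission of the tangential-plane projection (the 3D case needs none) are our illustrative assumptions.

```python
import math
import random

class LocalSphereSampler:
    """Sketch of local sphere texture sampling.

    A new texture sample is drawn only when the haptic interface point
    moves farther than `r` from the stored center `p`; otherwise the
    previous sample is reused.  The Gaussian vector source is an
    illustrative stand-in for any texture primitive.
    """
    def __init__(self, r, sigma=1.0):
        self.r = r
        self.sigma = sigma
        self.p = None        # center of the current sampling sphere
        self.sample = None   # texture vector tied to that center

    def update(self, ip):
        """`ip` is the current haptic interface point (x, y, z)."""
        if self.p is None or math.dist(ip, self.p) > self.r:
            self.p = ip      # sphere breached: store new center...
            self.sample = tuple(random.gauss(0.0, self.sigma)
                                for _ in range(3))   # ...draw new sample
        return self.sample
```

Only the current center and sample (plus optionally the previous sample, for smoothing) need to be stored, which is the method's main memory advantage.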



Figure 5: Local circle on a curved surface. IP is the haptic Interface Point.

Figure 6: 1D interpretation of a 2D local circle/sphere sampling path.

This technique does not involve mapping a 2D texture array onto a 3D surface, so it is independent of the overall object geometry. Extending this technique to three dimensions is also straightforward, and would alleviate problems on surfaces with sharp curves. The radius, r, now defines a sphere, and there is no need to determine the tangential plane and project the interface point onto it. Sampling resolution is controlled by r, and the Nyquist criterion should be considered, especially when sampling broadband deterministic or probabilistic signals. As r approaches zero, this method approaches a purely temporal sampling method. Advantages of this local sphere method include independence of object shape and minimal memory requirements. Unfortunately, spatial dependencies are more difficult to implement, since the samples are independent of direction and lack a fixed integer lattice. Texture repeatability is also not guaranteed.

3. GENERATING RANDOM TEXTURES

With the knowledge of the sampling methods described in the previous section, this section discusses the texture generation process. For purposes of discussion, consider a volume of texture that is a continuous 3D vector field. This is the texture space. A sample at any point in this volume is a 3D vector (the texture force vector) with no constraints on magnitude or direction. This continuous texture space can be generated by any number of processes, which can be deterministic, stochastic, or mixed. One method of describing the features of a texture is through its power spectrum, where modifications of the spectrum produce various textures. We consider models based on primitives, and methods for spectral shaping.

3.1. Stochastic models

An active area of random texture modeling research focuses on developing stochastic representations. These models are often characterized by the measured statistics of a given texture image, where low order statistics are often sufficient. Examples of statistical measures include autocorrelation, edge density, histogram features, and random field models. One class of stochastic models utilizes the autocorrelation function to develop a filter. The filter is then driven by white noise from a known probability density function (pdf) to synthesize the texture pattern. In the present setup, our goal is not to simulate real textures, but to produce textures that are perceptually different from each other. We can therefore take the converse approach, and generate textures by designing transformations of basic primitives. The simplest primitive to use for a model, as various techniques have indicated, is a white noise process from a chosen pdf (e.g., uniform, Laplacian, Gaussian). For texturing purposes, the noise primitive should have several properties: it should be bandlimited ("pink noise"), stationary (translationally invariant), isotropic (rotationally invariant), and exhibit no obvious periodicity or regular pattern. Filtered white noise possesses these properties. Gaussian white noise, in particular, has additional desirable properties. A linear transformation of a Gaussian random vector remains Gaussian, it occurs frequently in nature, and it is completely defined by its first and second order statistics. Just by modifying the variance, and hence the total power of the spectrum, a surface texture can range from smooth to rough, as implemented by Siira and Pai.
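A bandlimited noise primitive of the kind described here can be sketched by passing Gaussian white noise through a simple one-pole lowpass filter. This is a minimal stand-in for the filtering the text discusses, assuming nothing about the authors' actual filter design; the function name and parameters are illustrative.

```python
import random

def filtered_gaussian_noise(n, sigma=1.0, alpha=0.9, seed=None):
    """Generate n samples of a bandlimited noise primitive.

    Gaussian white noise is passed through the one-pole lowpass filter
    y[k] = alpha * y[k-1] + (1 - alpha) * x[k], attenuating high
    frequencies so the signal is safe for a force-reflecting device.
    Illustrative sketch, not the authors' filter.
    """
    rng = random.Random(seed)
    y, out = 0.0, []
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        y = alpha * y + (1.0 - alpha) * x
        out.append(y)
    return out
```

Raising sigma raises the total spectral power (a rougher texture); raising alpha narrows the bandwidth (a smoother one).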



Figure 7: 1D Gaussian mixture densities.

Figure 8: 2D Mixture of 2 Gaussian pdfs.

Now consider a multivariate or multi-dimensional noise primitive that produces the perturbation vector. Simple textures can be produced by generating the force vector from a multivariate pdf. A special case occurs when the statistics of the pdf in each dimension are independent, resulting in a diagonal covariance matrix. For example, the variance of the z component can be set to zero to produce only lateral force textures on an xy plane, and the variance in the x direction can be higher than in the y direction. For increasingly complex textures, a desired continuous pdf can be accurately approximated by mixing a number of Gaussian pdfs. This Gaussian mixture is expressed as

f(x) = Σ_{i=1}^{M} a_i N(μ_i, C_i),   (2)

where Σ_{i=1}^{M} a_i = 1 and N(μ_i, C_i) is the ith multivariate Gaussian pdf with mean vector μ_i and covariance matrix C_i. The weighting factors, the a_i terms, can be arbitrarily chosen, or modeled from a real texture. Figure 7 displays a simple 1D Gaussian mixture, and a representation of a 2D texture with two pdfs is shown in Fig. 8. For procedural techniques, a large M limits the flexibility of "tuning", although a large number of variable pdfs is not necessary in order to produce a wide range of textures. All of these configurations, while exhibiting potentially complex pdfs, still produce independent and identically distributed (iid) samples, and thus have a constant power spectrum. Filtering techniques can then be applied to these signals to "shape" the power spectrum with consideration to the desired spectrum and the power constraints of the haptic interface. These techniques include statistical transformations in addition to traditional filtering. Fractional Brownian motion (fBm), for example, is a scale invariant process described by a self-similarity parameter, h, and a 1/f power spectrum. The texture samples can originate from a Gaussian pdf, where the variance of a sample S_m midway between two samples with variance σ² decreases by a factor of 2^{-h}. Such scale invariance allows an object to have the same texture at any resolution. Another statistical filtering technique is to simply introduce correlation in the noise process. The components of the texture vectors can be correlated with each other, or the vectors can be correlated with another vector field such as surface gradients. Traditional filtering methods are discussed after introducing an alternative method for creating a desired power spectrum.
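Sampling from the mixture of Eq. (2) can be sketched in one dimension: a component i is selected with probability a_i, and a Gaussian draw is made from that component. The 1D restriction and all names are our assumptions; the multivariate case would draw a vector from N(μ_i, C_i) instead.

```python
import random

def sample_gaussian_mixture(weights, means, sigmas, rng=random):
    """Draw one scalar sample from the Gaussian mixture of Eq. (2).

    `weights` are the a_i terms (assumed to sum to 1); `means` and
    `sigmas` give each component's mean and standard deviation.
    Component i is picked with probability a_i, then a Gaussian draw
    is made from it.  1D illustrative sketch.
    """
    u, acc = rng.random(), 0.0
    for a_i, mu_i, s_i in zip(weights, means, sigmas):
        acc += a_i
        if u <= acc:                 # component i selected
            return rng.gauss(mu_i, s_i)
    return rng.gauss(means[-1], sigmas[-1])   # guard against rounding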

3.2. Implicit procedures and Fourier spectral synthesis

Using implicit procedures as basic building blocks, it is also possible to create a desired power spectrum. The simplest procedures use sinusoids, since they are impulses in the frequency domain. From Fourier theory, any periodic signal can be constructed by summing sinusoidal functions. Therefore, creating a texture by adding sine waves is a method of Fourier spectral synthesis, where an exact spectrum can be synthesized with an infinite number of sinusoids. The characteristics of the spectrum are adjusted through the amplitude and frequency of the sinusoids, and additional spatial effects are created with phase changes. Other periodic, implicit functions can be used, such as sawtooth or square waves, which will add higher frequency components at harmonic intervals. Recall that higher frequencies are acceptable with haptic interfaces provided the power at those frequencies is sufficiently low. Signals of this nature become vibrations in a haptic interface, which can enhance a texture.
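A 1D spectral synthesis of this kind can be sketched as a sum of sinusoids with chosen amplitudes, spatial frequencies, and phases. The function and parameter names are illustrative; in a haptic setting, the amplitudes of high-frequency terms would be kept small for device stability.

```python
import math

def spectral_texture_height(x, amps, freqs, phases):
    """Fourier spectral synthesis of a 1D texture profile.

    Sums sinusoids with the given amplitudes, spatial frequencies
    (cycles per unit distance), and phases.  Illustrative sketch of
    the technique, not the authors' parameterization.
    """
    return sum(a * math.sin(2.0 * math.pi * f * x + p)
               for a, f, p in zip(amps, freqs, phases))
```

For example, two sinusoids with incommensurate frequencies already produce a profile with no obvious repetition over a short stroke.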

[Block diagram: white noise from known pdf(s) and a deterministic/periodic signal enter a mixer; the mixed signal passes through the filter H(z), with parameter adjustment, to produce the texture samples.]

Figure 9: 2D sinusoid with multiplicative and additive noise.

Figure 10: Block diagram of texture modeler.

There are a number of ways to use multivariate, implicit functions for defining a texture. A simple case uses the implicit functions to modify the surface height, where the perturbation vector is coincident with the normal. Anderson utilized this technique successfully by summing sinusoids and square waves defined on a surface. A set of two sine waves and one square wave was used for each dimension in order to incorporate directional features, such as wood grain. Another possibility is to compute the local gradient of the textured surface through differentiation of the implicit functions. If the period is sufficiently large, periodic functions can be used to create a texture that appears to be random. Another way to create random textures with periodic functions is to incorporate a stochastic process. The texture force vector, Ft, at position p can be expressed as


Ft(p) = S g(p) + N,   (3)

where g(p) is an implicit, deterministic function, S is multiplicative noise, and N represents additive noise. Multiplicative noise can be used to randomly scale a deterministic function for further texture variation, Fig. 9, an effect not typically found in real textures.
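A 1D realisation of Eq. (3) can be sketched directly: a sinusoidal g, a multiplicative Gaussian factor S around 1, and additive Gaussian noise N. All parameter values and names are illustrative assumptions.

```python
import math
import random

def noisy_periodic_texture(x, amp=1.0, freq=2.0,
                           mult_sigma=0.1, add_sigma=0.05, rng=random):
    """One 1D realisation of Eq. (3): Ft = S*g(p) + N.

    g is a sinusoid with amplitude `amp` and spatial frequency `freq`,
    S is a multiplicative Gaussian factor around 1, and N is additive
    Gaussian noise.  Illustrative parameter choices.
    """
    g = amp * math.sin(2.0 * math.pi * freq * x)
    S = 1.0 + rng.gauss(0.0, mult_sigma)
    N = rng.gauss(0.0, add_sigma)
    return S * g + N
```

Setting both sigmas to zero recovers the purely deterministic texture, which makes the contribution of each noise term easy to probe in isolation.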

3.3. Filtering

Filtering for haptic texture generation is used to shape the power spectrum of the texture primitive, while keeping the high frequency power sufficiently low. A qualitative texture classification can roughly translate into a spectral description. For example, textures may be rough, smooth, coarse, fine, granulated, rippled, regular, or irregular. Some categories denote periodicity, while others imply force amplitudes (spectral power) or a range of frequency components. By manipulating the power spectrum of the input primitives, these qualitative descriptors can be changed. Figure 10 displays the block diagram for the general texture modeler, where H(z) is the total transfer function of the filtering operation. With these goals and classifications in mind, a variety of filtering techniques can be employed. A linear FIR filter is easy to design and implement, and offers a fair amount of flexibility. Effective spectral shaping of white noise in specified bands can be accomplished, for example, using a bank of controllable bandpass filters (a parametric equalizer). The controllable parameters for each filter are center frequency, bandwidth, and gain, providing amplitude regulation over a selective band of frequencies. Better spectral control can be achieved using IIR filters at the expense of design complexity and stability sensitivity. When IIR filters are fed by a noise process, they are also known as autoregressive moving average (ARMA) models. ARMA models have been used to represent random fields, among other processes. As mentioned previously, the texture force must be defined at points not on the lattice using interpolation. A linear interpolation scheme is commonly used to filter the transition between samples, but it can produce discontinuous derivatives, resulting in a sudden change in the angle of the texture vector. At the cost of a few more computations,


quadratic or cubic interpolation techniques can be used for a smoother force signal. Using the local sphere method, where spatial relationships beyond distance do not exist, the next sample can be generated ahead of time. This sample will become the new texture vector at a distance of r from the current sample, so interpolation can be performed between the two. If interpolation is used here, lowpass filtering with the previous sample is not necessary, because the interpolation provides adequate smoothing.
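The smoother blending described here can be sketched with a cubic ease curve between two successive texture vectors. The specific polynomial (the classic smoothstep) is our illustrative choice, not necessarily the quadratic or cubic scheme the authors used.

```python
def smooth_blend(prev, new, t):
    """Blend two successive texture vectors with a cubic ease curve.

    t in [0, 1] is the fraction of the sampling distance r travelled
    from the previous sample center.  The cubic 3t^2 - 2t^3 has zero
    slope at both ends, avoiding the derivative discontinuity of
    plain linear interpolation.  Illustrative sketch.
    """
    s = t * t * (3.0 - 2.0 * t)
    return tuple(p + s * (n - p) for p, n in zip(prev, new))
```

Because the blend flattens out at t = 0 and t = 1, the force direction changes gradually as the interface point crosses each sampling sphere.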

3.4. Implementation issues

There are several implementation issues that arise in utilizing a noise primitive and filters for haptic texture generation; namely, computation time, memory requirements, and repeatability. To speed up the time for pseudo-random vector (PRV) generation, and transformation to the desired pdf, the PRVs can be pre-generated for every lattice point. If every sample is not absolutely necessary, as they are for global sample dependencies for example, memory can be saved by using a look-up table (LUT) or hashing function. Graphics texturing methods have shown that 256 table elements are adequate to avoid noticeable pattern repetition. Being a local sense, haptics is even less sensitive to this repetition, so 256 entries are adequate here as well. If multiple pdfs are used, though, an LUT must be generated for each one. Since true random processes are not dependent on an input value, repeatability of a texture is achieved through the definition of a noise function. This function takes a position as input and associates it with an index into the LUT, similar to Perlin's Noise function, where Noise(x, y, z) returns the LUT texture value. Filter implementation also involves time-memory tradeoffs. Pre-filtering will save time during the simulation at the expense of memory. This pre-filtering is straightforward if it is performed on an implicit function or a pre-generated lattice, but pre-filtering is not feasible if the LUT technique is used, since the table values have no spatial relationships. One solution is to define the filter kernel center as the occupied lattice cell, so that it moves with the haptic interface position. For example, all of the samples within a defined radius can be weighted by a spherical Gaussian function before interpolation. Permanent replacement of LUT values from a local filter kernel will cause global changes, though. Therefore, the filter must be limited to an FIR configuration, which is not necessary if all lattice samples are stored. Another solution is to implement techniques equivalent to graphics methods for solid textures.
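The LUT-plus-hash idea for repeatability can be sketched as follows. The 256-entry table follows the convention cited in the text, but the specific hash constants, the Gaussian table contents, and the fixed seed are our illustrative assumptions, not Perlin's actual Noise function.

```python
import random

# 256-entry look-up table of pre-generated texture values; a hashing
# function maps integer lattice coordinates to an index, so the same
# position always yields the same value (repeatability).  Table size
# follows the graphics convention; the mixing constants and Gaussian
# entries are illustrative.
_rng = random.Random(42)
LUT = [_rng.gauss(0.0, 1.0) for _ in range(256)]

def noise(x, y, z):
    """Return the repeatable texture value at lattice point (x, y, z)."""
    h = (x * 73856093) ^ (y * 19349663) ^ (z * 83492791)
    return LUT[h % 256]
```

Because the table is fixed and the hash is deterministic, stroking the same surface region twice reproduces the same texture, with only 256 stored values.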


4. RESULTS Using the techniques discussed in this paper, simple stochastic textures have been rendered. Our system consists of the PHANToMTM connected to a 120 MHz Pentium PC, and graphical display on an SGI Crimson to save processing time on the PC. To eliminate time for computing coordinate system transformations, objects were rendered directly in world space, as were the texture force vectors. The use and modi cation of di erent sampling and ltering methods, in addition to a friction model, created an assortment of textures. The rst textures were implemented on a 2D lattice on a plane. The textures produced by a Gaussian noise primitive were rough like granite or gravel. Only linear interpolation was used to lter the signal, which presented the need for higher order interpolation when the sampling resolution was high. The resolution, and thus the feature size, was modi ed through adjustments in the lattice spacing in each dimension. At larger spacing, the texture became \blocky", as expected. An exception resulted when only one direction had a large sampling interval. In that case, a grain-like texture resulted, and the \blockiness" was not felt because the cell width was too small to avoid moving into an adjacent cell. Directional ltering can be used to modify the resolution, to a certain extent, without this aliasing e ect. The friction models that were also implemented included both Coulomb and viscous friction. Both models had adjustable coecients, and the viscous friction could also be made proportional to the constraint force in addition to velocity, similar to Anderson's model. Statistical alterations of each texture vector component was also simple to implement since each component was generated independently. As previous researchers discovered, higher variance produced a rougher texture. 
We also noticed that a texture formed using only lateral components was perceptually indistinguishable from one using only normal components when the feature size was small. Adjusting the mean of the vector components caused the texture force field to point in the direction defined by the mean shift (+ or - for each component). This created an "active" texture analogous to rough water flow.

Texture on a 3D lattice was then a natural extension. The implementation of the flow textures proved to be very interesting because a texture that felt like gravel was rendered in a volume without solid surfaces. Since there was no constraint force, the variance was effectively constant, providing a different effect than that of the surface texture. This texture implementation led to the configuration used to represent 3D vector fields for visualization. Drag friction, which is proportional to velocity squared, was added for a more "realistic" sensation. Due to the large number of stored samples, this method also motivated the need for local sampling techniques.

Generating texture vectors on the fly was the main advantage of the local sphere technique, which requires storage of only two vectors. The previous sample was also stored for smoothing the transitions using three Butterworth IIR filters (one for each vector component). Varying degrees of roughness were achieved through a combination of variance and filter cutoff frequency alterations. This technique also worked well on polyhedral surface models with friction. Resolution adjustments also produced varying feature sizes, but since the features were spherical, the effect was not as noticeable as the "blocky" effect of the lattice methods.

These observations led to techniques that can be implemented for the purpose of data visualization. First, consider a surface representing a data set. Add one texture to it, and modulate a parameter of that texture according to changes in the data, as seen in Fig. 11. For the graphical display, the variance was modulated by the surface magnitude. Because the range of stable variance (spectral power) is limited, an additional method of conveying the same information is to also modulate the coefficient of friction with the surface amplitude. Another method, which was briefly mentioned above, would be to define a correlation between the texture vectors and the gradients of the surface at the same sample locations.

Figure 11: Data modulated texture. a) original surface b) modulated texture (exaggerated to show detail)

Figure 12: False texturing. a) original image b) textured image
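A minimal sketch of on-the-fly texture vector generation with smoothing and data-modulated variance follows. A first-order low-pass stands in for the three Butterworth filters described above, and the class name, default parameters, and the proportional modulation law are all assumptions for illustration.

```python
import random

class SmoothedTextureVector:
    """Generate random texture force vectors on the fly, smoothing each
    component with a first-order IIR low-pass (a simplification of the
    per-component Butterworth filters described in the text)."""

    def __init__(self, sigma=1.0, alpha=0.2, seed=0):
        self.sigma = sigma           # base standard deviation (roughness)
        self.alpha = alpha           # filter coefficient: smaller -> smoother
        self.rng = random.Random(seed)
        self.prev = [0.0, 0.0, 0.0]  # previous filtered vector

    def next_vector(self, data_value=1.0):
        # Modulate the variance by the local data magnitude, as in the
        # visualization scheme: larger |data| -> rougher texture.
        sigma = self.sigma * abs(data_value)
        out = []
        for k in range(3):
            raw = self.rng.gauss(0.0, sigma)
            # y[n] = (1 - alpha) * y[n-1] + alpha * x[n]
            out.append((1.0 - self.alpha) * self.prev[k] + self.alpha * raw)
        self.prev = out
        return out
```

Raising `sigma` or the cutoff-like coefficient `alpha` both roughen the texture, mirroring the combination of variance and filter cutoff adjustments reported above.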

Figure 13: Block diagram of the visualization system. (The position input and its derivative drive the texture and friction models; their outputs Ft and Ff, together with the scaled constraint force Fc and the interpolated local samples, are summed into Frslt and low-pass filtered by the hardware stability filter to produce Ffinal.)

Haptic texturing can also be used for a haptic representation of regions of interest, in the same fashion that false coloring is used in images. A simple example is seen in Fig. 12, where each gray level corresponds to a different texture. The implementation of this type of example involves defining a region of the data, possibly through segmentation, and associating it with a particular texture. The system block diagram for this application is displayed in Fig. 13, where the texture model can be adjusted to the data as a pre-processing step. The dashed lines indicate that the object/data can adjust model parameters for both texture and friction. Since both the friction and texture forces can be proportional to the constraint force, the scaled Fc is incorporated after Ff and Ft are determined from the respective models.
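The combination stage of Fig. 13 might be sketched as follows, assuming (as the text suggests) that the friction and texture forces are scaled by the constraint force magnitude before summation. The function names, the gain k2, and the first-order stand-in for the hardware stability filter are assumptions, not the paper's implementation.

```python
def combine_forces(fc, ff, ft, k2=1.0):
    """Combine constraint (fc), friction (ff), and texture (ft) force
    vectors: friction and texture are scaled by the constraint force
    magnitude, then summed with the constraint force itself."""
    mag = sum(c * c for c in fc) ** 0.5
    scale = k2 * mag
    return [c + scale * (f + t) for c, f, t in zip(fc, ff, ft)]

def stability_filter(frslt, prev, beta=0.3):
    """First-order low-pass stand-in for the hardware stability filter:
    F_final[n] = (1 - beta) * F_final[n-1] + beta * F_rslt[n]."""
    return [(1 - beta) * p + beta * f for p, f in zip(prev, frslt)]
```

At each servo tick (about 1 kHz, per the introduction), the device would receive `stability_filter(combine_forces(fc, ff, ft), prev_output)`, keeping the high-frequency power of the commanded force low.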

5. CONCLUSIONS AND FUTURE WORK

The techniques discussed in this paper, and the preliminary results obtained, show that a wide variety of textures can be simulated through the use of simple random and pseudo-random models, linear filters, and the addition of friction. Filtering not only provides texture variation by modifying the power spectrum, but also helps to keep the texture force signal from causing instability problems in a haptic interface. These textures may not represent real textures, but they are suitable for data visualization, where the important feature is perceptible differences.

Haptic texture research has only just begun, and there is ample opportunity for future work. For our application, psychophysical studies need to be conducted to determine how distinguishable the textures are, and which methods are better suited for information extraction. The implementation techniques will also require more study and optimization to meet time and memory requirements effectively. More research should also be conducted on the use of filtering methods for texture variation, including the implementation of the equalizer and fractal texture models.

6. ACKNOWLEDGEMENTS

This project is funded by the National Science Foundation, Grant # HRD-9450019, with additional support from the Nemours Foundation Research Program.

7. REFERENCES

1. T. Anderson. A virtual universe utilizing haptic display. In The First PHANToM User's Group Workshop, Dedham, MA, Sept. 1996. MIT Research Lab of Electronics/Artificial Intelligence Lab.
2. J. F. Blinn. Simulation of wrinkled surfaces. Computer Graphics (SIGGRAPH '78 Proceedings), 12:286-292, Aug. 1978.
3. G. Burdea. Force and Touch Feedback for Virtual Reality. J. Wiley & Sons, New York, NY, 1996.
4. E. E. Catmull. A Subdivision Algorithm for Computer Display of Curved Surfaces. PhD thesis, Department of Computer Science, University of Utah, Dec. 1974.
5. R. L. Cook. Shade trees. Computer Graphics (SIGGRAPH '84 Proceedings), 18:223-231, July 1984.
6. J. R. Deller Jr., J. G. Proakis, and J. H. L. Hansen. Discrete-Time Processing of Speech Signals. Macmillan Publishing Company, 1993.
7. D. S. Ebert (editor), F. K. Musgrave, D. Peachey, K. Perlin, and S. Worley. Texturing and Modeling: A Procedural Approach. AP Professional, Boston, MA, 1994.
8. J. P. Fritz. Haptic techniques for scientific visualization. Master's thesis, Department of Electrical Engineering, University of Delaware, 1996. (Work in progress).
9. Y. Fukui. Bump mapping for a force display. In The First PHANToM User's Group Workshop, Dedham, MA, Sept. 1996. MIT Research Lab of Electronics/Artificial Intelligence Lab.
10. S. Haruyama and B. A. Barsky. Using stochastic modeling for texture generation. IEEE Computer Graphics and Applications, pages 7-19, Mar. 1984.
11. A. K. Jain. Fundamentals of Digital Image Processing. Prentice-Hall, Inc., New Jersey, 1989.
12. R. L. Kashyap. Characterization and estimation of two-dimensional ARMA models. IEEE Transactions on Information Theory, IT-30(5):736-745, Sept. 1984.
13. R. L. Klatzky, S. Lederman, and C. Reed. There's more to touch than meets the eye: the salience of object attributes for haptics with and without vision. Journal of Experimental Psychology: General, 116(4):356-369, 1987.
14. C. G. Looney. Electrical Engineering Handbook, chapter 67, pages 1488-1510. CRC Press, 1993.
15. T. Massie and J. K. Salisbury. The PHANToM haptic interface: A device for probing virtual objects. In ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994.
16. M. Minsky, O.-y. Ming, O. Steele, F. P. Brooks, and M. Behensky. Feeling and seeing: issues in force display. In Proceedings of the Symposium on 3D Real-Time Interactive Graphics, volume 24 of Computer Graphics, pages 235-243, New York, 1990. ACM.
17. M. M. Minsky. Computational Haptics: The Sandpaper System for Synthesizing Texture with a Force-Feedback Haptic Display. PhD thesis, Massachusetts Institute of Technology, 1995.
18. K. Perlin. An image synthesizer. Computer Graphics (SIGGRAPH '85 Proceedings), 19(3):287-296, July 1985.
19. K. Perlin and E. M. Hoffert. Hypertexture. Computer Graphics (SIGGRAPH '89 Proceedings), 23:253-262, July 1989.
20. K. Salisbury, T. Massie, D. Brock, N. Swarup, and C. Zilles. Haptic rendering: Programming touch interaction with virtual objects. In ACM Symposium on Interactive 3D Graphics, 1995.
21. J. Siira and D. K. Pai. Haptic texturing - a stochastic approach. In International Conference on Robotics and Automation. IEEE, 1996.
22. A. Speis and G. Healey. An analytical and experimental study of the performance of Markov random fields applied to textured images using small samples. IEEE Transactions on Image Processing, 5(3):447-458, Mar. 1996.