PROPRIOCEPTION AND FORCE FEEDBACK
Roope Raisamo and Jukka Raisamo
Multimodal Interaction Research Group
Tampere Unit for Computer-Human Interaction
Department of Computer Sciences, University of Tampere, Finland

Outline

• Force sensing and control
• Force feedback devices
• Haptic rendering


Force sensing and control


Force feedback interface

• Force feedback interfaces can be viewed as having two basic functions (Tan et al., 1994):
– to measure the positions and contact forces of the user’s hand (and/or other body parts), and
– to display contact forces and positions to the user.


Haptic interface design experiences
• The major perceptual issues are the following:
– force sensing under quasi-static and dynamic conditions,
– pressure perception,
– position sensing resolution, and
– the level of stiffness required for rigidity simulation.

• The major manual performance issues are the following:
– the maximum forces humans can produce,
– the precision with which humans can control a force, and
– the control bandwidth of the force.

• The issues of ergonomics and comfort are also important.

Force sensing

• In order for the user to perceive the forces displayed by the device as smoothly varying, the force display resolution of the device should match or exceed human sensing resolution.
– The just-noticeable difference (JND) for human force sensing is around 7%.
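To see what a ~7% Weber fraction implies for a device, the sketch below counts how many force levels, each one JND apart, fit into an output range (the range endpoints are illustrative values, not from the slides):

```python
def distinguishable_levels(f_min, f_max, jnd=0.07):
    """Count force levels in [f_min, f_max] spaced one JND (Weber fraction) apart."""
    levels = 0
    f = f_min
    while f <= f_max:
        levels += 1
        f *= 1.0 + jnd   # the next perceivable level is ~7% stronger
    return levels

# An illustrative device spanning 0.1 N to 10 N:
print(distinguishable_levels(0.1, 10.0))  # 69 distinguishable force levels
```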


Unintended vibration
• One of the most noticeable disturbances in a force reflecting device is the level of unintended vibration.
– A significant level of vibration can quickly destroy the feeling of free motion or disturb the perception and control of virtual objects in contact.
– The human body is very sensitive to vibration, so careful attention to hardware design and control software is needed.

Pressure perception 1/3
• Many force-reflecting exoskeletons are attached to the user’s forearm in order to display contact forces at the fingertips.
– An unbalanced contact force at the fingertip is mechanically grounded at the forearm, and not supported by the entire upper body as it is in the real world.
– The rest of the user’s body does not experience the effects of this force.


Pressure perception 2/3

– The perceptual effectiveness of such a non-realistic display is unknown.
– The true ground illusion might be successfully produced if the pressure distribution and its changes at the grounding location are below the absolute detection and discrimination thresholds, respectively, for the human user.


Pressure perception 3/3
• Pressure JND decreases as a function of contact area.
– People are more sensitive to pressure changes when the contact area is enlarged.
– This effect is independent of location.

• The contact area at the attachment points of exoskeletons for mechanical ground should be minimized and the perimeter of the area should be maximized, in order to improve the illusion of true grounding.


Position sensing resolution
• The position sensing resolution desired in the device depends on the position resolution of the human operator.
– Joint angle resolution is better at proximal joints than at distal ones.
• JNDs for joint angle resolution: wrist (2.0°), elbow (2.0°), and shoulder (0.8° for both side and front movements).
– This helps humans control end points (i.e., fingertips) accurately.
• An error of 1° in shoulder joint angle sensing would result in an error of 1 cm at the index fingertip.
• For a finger joint, the corresponding error would be 0.08 cm.
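The fingertip-error figures above follow from simple arc-length arithmetic; the sketch below reproduces them under assumed limb lengths (~0.6 m shoulder-to-fingertip, ~4.5 cm for the final finger segment, both illustrative values):

```python
import math

def endpoint_error(joint_angle_error_deg, length_m):
    """Small-angle approximation: endpoint error = angle (rad) * distance to endpoint."""
    return math.radians(joint_angle_error_deg) * length_m

# Assumed ~0.6 m from shoulder to fingertip:
print(round(endpoint_error(1.0, 0.6) * 100, 2), "cm")    # 1.05 cm
# Assumed ~4.5 cm from the last finger joint to the fingertip:
print(round(endpoint_error(1.0, 0.045) * 100, 3), "cm")  # 0.079 cm
```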

Stiffness
• The mechanical behavior of most solid objects in a virtual world is modeled with elastic stiffness.
– Many virtual objects are supposed to appear rigid.
– In virtual environments, stiffness and rigidity are perceptual notions that need to be decided case-specifically.

• What is the stiffness required to convince a user that an object is rigid?
– Perceived rigidity depends on both the stiffness of the interface hardware and whether the comparison is done among a set of virtual object simulations or between a simulation and a real object.
– For example, users can consistently judge the relative stiffness of different virtual walls even though they are never as rigid as real walls due to hardware limitations.

Force output range
• To match human performance, the maximum force exerted by the device should meet or exceed the maximum force humans can produce.
• A study measured the maximum controllable force humans can produce with joints of the hand and the arm.
– The subjects were asked to exert a maximum force and maintain it for 5 seconds.
– Maximum force increased from the most distal joint (finger) to the most proximal joint (shoulder).
– Control over force output was much better with the shoulder joint than with the finger joint.

Force output resolution
• The force on a joint linkage must be controllable at a sufficiently fine resolution.
• In order to present a perceptually smoothly varying force, the forces displayed by the device must be controllable to at least the level at which humans can sense and control force.
• Studies have measured the precision with which humans can produce a mid-range force with joints of the hand and the arm.
– The resolution is somewhat lower for proximal joints, but they can control much larger forces.


Force bandwidth
• The force control and perceptual bandwidths of a human operator are quite different.
– Vibrotactile stimuli can be perceived up to about 500 Hz.
– The upper bound of force control bandwidth is 20 to 30 Hz.
– However, the actual bandwidth is considerably less, and has been reported to be about 7 Hz.
• The bandwidth of the device when it is backdriven by the human operator should at least match the force control bandwidth of the operator.

Ergonomics and control

• Sizing and fatigue are also important issues, especially for exoskeletal devices.
– Bad ergonomics can easily ruin an otherwise excellent haptic display.


Human haptics

• Kinesthetic information
– Net forces along with position and motion of limbs
– Coarse properties of objects
– Large shapes, spring-like compliances


Degrees of Freedom (DOF)
• Degrees of freedom are limited:
– Degrees of freedom for force feedback display
• 6DOF for position and orientation
• Large number of joints and contact points
– Contact point: grasping versus stylus.
– The focus is usually on hands: resolution, detection, and discrimination ability increase distally.
– However, movement speed decreases if natural movements are constrained.


Human arm degrees of freedom

Caldwell et al. (1995)

Force feedback for display of texture and shape
• In figure A, the sensation of a textured surface can be produced via a stylus that moves according to the virtual surface texture.
• In figure B, a stylus can be used to probe the surface characteristics of a virtual object.

Analysis of grasps

Cutkosky and Howe (1990)

Sensing and control bandwidth
• Sensing bandwidth
– The frequency with which tactile and kinesthetic stimuli are sensed.
• Control bandwidth
– The speed with which humans can respond.
• Sensing bandwidth is much larger than the control bandwidth.
• The output (control) loop has a typical force response bandwidth of 5-10 Hz, while the kinesthetic/proprioceptive sensing bandwidth is about 20-30 Hz.


Physical modeling for virtual objects
(Diagram, adapted from Burdea, 1996: haptic interface control ties together compliance & texture, surface deformation, grasping, hard contact, physical constraints, and collision detection for the haptic interface.)

Haptic system architecture
(Diagram, adapted from Burdea, 1996: the user receives visual and audio feedback from a distributed computing platform running high-level control, and exchanges haptic feedback with the haptic interface through an interface controller running low-level control. Energy flow between user and interface is bi-directional; the other links carry one-way information flow.)

Force Feedback Devices


Force feedback devices
• 1 degree of freedom
– Steering wheels: Hard Driving (Atari), Ultimate Per4mer (SC&T2)
• 2 degrees of freedom
– Pens and mice: Pen-Based Force Display (Hannaford, U. Wash.), MouseCAT/PenCAT (Hayward, Haptic Tech., Canada), Feel-It Mouse (Immersion)
– Joysticks: Force FX (CH Products), Sidewinder Force Feedback Pro (Microsoft)


1D and 2D gaming devices


PenCAT


Force feedback mice

• In 1999, Immersion Corporation’s Feel-It force feedback mouse was introduced as the Logitech WingMan Force Feedback Mouse.


Force feedback devices
• 3 degrees of freedom
– PHANToM (SensAble Technologies)
– ForceDimension Delta and Omega
– Novint Falcon
– Impulse Engine (Immersion)
• 6+ degrees of freedom
– PHANTOM Premium 6 DOF, ForceDimension Delta 6 DOF, …
– Teleoperator masters (MA-23, Argonne, CRL)
– Freedom 6/7 (Hayward, MPB Technologies)
– 6DOF (Cybernet)

SensAble Technologies: PHANToM

http://www.sensable.com/


ForceDimension Omega and Delta


Novint Technologies: Falcon http://www.novint.com/


FCS Systems: HapticMASTER


3 DoF Force Feedback Joystick

Impulse Engine 2000


Medical Force Feedback Systems


Immersion CyberGrasp


What makes a good haptic interface?

• Must work with human abilities and limitations
• Approximations of real-world haptic interactions are determined by the limits of human performance


A good haptic interface

• Free motion must feel free
– Low back-drive inertia and friction
– No motion constraints

• Ergonomics and comfort
– Pain, discomfort and fatigue will detract from the experience


A good haptic interface (2)

• Suitable range, resolution and bandwidth
– The user should not be able to go through rigid objects by exceeding the force range
– No unintended vibrations
– Solid objects must feel stiff


Haptic Rendering

Courtesy of Reachin Technologies AB, www.reachin.se


Haptic rendering
• Haptic rendering is the process of computing and generating forces in response to user interactions with virtual objects, based on the position of the device.
• Haptic rendering of an object can be seen as pushing the device out of the object whenever it moves inside it.
• The human sense of touch is sensitive enough to require a processing speed of at least 1000 Hz for haptic rendering.

Haptic rendering
• The haptic rendering needs to provide forces that push the user out of the object.
• The further inside the object you move, the greater the force pushing you out.
• This makes the surface feel solid.
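This push-out behavior is commonly implemented as a spring-like penalty force proportional to penetration depth. A minimal sketch, with an illustrative stiffness of 1000 N/m and a hypothetical flat floor (none of these values come from the slides):

```python
def penalty_force(device_pos, surface_point, surface_normal, k=1000.0):
    """Spring-like penalty force pushing the device out along the surface normal.

    k is an illustrative stiffness (N/m); real devices tune it for stability.
    """
    mag = sum(c * c for c in surface_normal) ** 0.5
    n = [c / mag for c in surface_normal]           # unit outward normal
    # penetration depth: how far the device sits below the surface point
    depth = sum((s - d) * c for s, d, c in zip(surface_point, device_pos, n))
    if depth <= 0.0:                                # outside: free motion, no force
        return [0.0, 0.0, 0.0]
    return [k * depth * c for c in n]               # deeper -> stronger push-out

# Device 2 mm below a horizontal floor at z = 0:
print(penalty_force([0.0, 0.0, -0.002], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
# [0.0, 0.0, 2.0] -> a 2 N upward force
```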


1000 Hz Performance Requirement

• The user becomes a part of the simulation loop.
• 1000 Hz is necessary so that the whole system doesn’t suffer from disturbing oscillations.
• The PHANTOM haptic devices run their control loop at 1000 Hz.
• The consequence: we are very limited in the amount of computation that we can do.


1 kHz haptics rendering: update speeds
“Real-time loop” (~1000 Hz)
– Necessary due to the high sensitivity of human touch.
– Not necessary to look at every object in the scene 1000 times per second.
“Scene-graph loop” (~60 Hz)
– Looks at every object in the scene and generates surface instances that are rendered at 1000 Hz.
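The two update rates can be sketched as a single loop in which the fast servo path only reads a cached local approximation that the slower scene-graph pass refreshes (all names and rates here are a toy illustration of the structure, not Reachin API code):

```python
def haptic_loop_sketch(duration_ms=50, scene_hz=60):
    """Two-rate sketch: a 1 kHz servo path renders forces from a cached local
    surface approximation; a slower scene-graph pass refreshes that cache."""
    local_patch = None
    servo_ticks = scene_updates = 0
    for ms in range(duration_ms):              # one servo tick per millisecond
        # scene-graph pass: roughly every 1000/scene_hz milliseconds
        if ms * scene_hz // 1000 >= scene_updates:
            local_patch = ("patch", ms)        # placeholder local approximation
            scene_updates += 1
        # servo pass: the force would be computed here from local_patch only
        servo_ticks += 1
    return servo_ticks, scene_updates

print(haptic_loop_sketch())  # (50, 3): 50 servo ticks, 3 scene updates in 50 ms
```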

1 kHz haptics rendering
• The real-time “surface” is a parametric surface.
• This means that it can be curved to closely match the real surface curvature locally.
• The finger is the actual position of the haptic device.
(Figure: the force on the finger is computed against a local surface patch of the scene-graph object.)

1 kHz haptics rendering
• The real-time “surface” has a 2D (S, T) coordinate space to allow programmers to define haptic surface effects as a function of position and penetration depth.
(Figure: haptic position shown in the surface’s S-T coordinate space.)

1 kHz haptics rendering
• 3-DOF haptic devices are rendered in the API using a spherical “proxy”.
• The proxy stays on the surface of objects, and is maintained such that it is at the closest point on the surface of an object to the haptic device.
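For a sphere, the proxy update described above reduces to projecting the device position back onto the surface. A minimal god-object-style sketch (the sphere and all numbers are illustrative, not from the slides):

```python
import math

def update_proxy(device_pos, center, radius):
    """God-object style proxy for a sphere: the proxy is the closest point on the
    surface to the device, and stays on the surface during penetration."""
    d = [p - c for p, c in zip(device_pos, center)]
    dist = math.sqrt(sum(x * x for x in d))
    if dist >= radius:                # outside the sphere: proxy = device position
        return list(device_pos)
    scale = radius / dist             # project back onto the surface (dist > 0)
    return [c + x * scale for c, x in zip(center, d)]

# Device has penetrated 0.2 into a unit sphere at the origin:
print(update_proxy([0.8, 0.0, 0.0], [0.0, 0.0, 0.0], 1.0))  # proxy held at the surface
```

The force can then be computed from the proxy-to-device displacement, which avoids the thin-object “pop-through” problem of using penetration depth alone.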


Device support
• Reachin API currently supports:
– SensAble PHANToM family
– ForceDimension Delta and Omega haptic devices
– Immersion Laparoscopic Surgical Workstation and Impulse Engine
– Virtuose devices from Haption
– Novint Falcon

• Devices are generally configured for each computer rather than for each application.
• The same application can be used with different hardware settings without changing the code.

Degrees of Freedom
Haptic rendering for pen-type devices:
• 3DOF haptics: point-object interaction (3D force output)
• 6DOF haptics: object-object interaction (3D force + torque output)
DOF = degree of freedom


Haptic interaction


3DOF Haptics: Introduction
• Output: 3D force → 3DOF haptics
• Limited to applications where point-object interaction is enough.
– Haptic visualization of data
– Painting and sculpting
– Some medical applications
(Figure: point-object versus object-object interaction.)

6DOF Haptics: Introduction
• Output: 3D force + 3D torque
• For applications related to manipulation.
– Assembly- and maintenance-oriented design; removal of parts from complex structures.
• Typical problem: peg-in-the-hole, where there is a net torque.
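The net torque arises because the contact force acts at a lever arm from the grasp point; a 6DOF device must display tau = r x F in addition to the force. A minimal sketch with illustrative numbers (not from the slides):

```python
def torque(r, f):
    """Net torque about the grasp point: tau = r x F (r is the lever arm)."""
    return [r[1] * f[2] - r[2] * f[1],
            r[2] * f[0] - r[0] * f[2],
            r[0] * f[1] - r[1] * f[0]]

# Contact 5 cm along the peg, 2 N sideways reaction force (illustrative values):
print(torque([0.05, 0.0, 0.0], [0.0, 2.0, 0.0]))  # [0.0, 0.0, 0.1] N*m
```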


Haptic rendering
• Two parts: collision detection and collision response.


Two types of interactions
• Point-based haptic interactions
– Only the end point of the device, or haptic interface point (HIP), interacts with the virtual object.
– When it moves, the collision detection algorithm checks whether the end point is inside the virtual object.
– Depth is calculated as the distance between the HIP and the closest surface point.
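A minimal sketch of this point-based check for a spherical object (the object and all numbers are illustrative, not from the slides):

```python
import math

def hip_penetration(hip, center, radius):
    """Point-based check: is the haptic interface point (HIP) inside the sphere,
    and how deep?  Depth = distance from the HIP to the closest surface point."""
    dist = math.dist(hip, center)      # Euclidean distance (Python 3.8+)
    depth = radius - dist
    return depth > 0.0, max(depth, 0.0)

# HIP 0.1 inside a unit sphere at the origin:
inside, depth = hip_penetration([0.0, 0.0, 0.9], [0.0, 0.0, 0.0], 1.0)
print(inside, round(depth, 3))  # True 0.1
```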


Two types of interactions (2)

• Ray-based haptic interactions
– The probe of the haptic device is modeled as a line segment whose orientation matters.
– It can touch multiple objects simultaneously.
– Torque interactions are possible.


Haptic texturing


History
• Argonne ’54: first master-slave systems; master = slave.
• Salisbury ’80: independent master and slave; Cartesian control of a robot arm. Later, interaction with a computer-simulated slave (VR).
• GROPE project, UNC ’67-’90: molecular docking.
• Minsky ’90: the Sandpaper.
• Massie & Salisbury ’94: the PHANTOM device.
• Early ’90s to ’97: 3DOF haptics.
• Late ’90s to today: 6DOF haptics.


References
• Basdogan, C. and Srinivasan, M.A. “Haptic rendering in virtual environments.” http://network.ku.edu.tr/~cbasdogan/Papers/VRbookChapter.pdf
• Chen, E. “Six degree-of-freedom haptic system for desktop virtual prototyping applications.” Proc. First International Workshop on Virtual Reality and Prototyping, p. 97-106, 1999.
• Gregory, A., Lin, M., Gottschalk, S. and Taylor, R. “A framework for fast and accurate collision detection for haptic interaction.” Proc. IEEE Virtual Reality (VR 99), p. 38-45, 1999.
• Mark, W. et al. “Adding force feedback to graphics systems: issues and solutions.” Proc. ACM SIGGRAPH 1996.
• Massie, T.H. and Salisbury, K. “The PHANTOM haptic interface: a device for probing virtual objects.” Proc. ASME Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994.
• McNeely, W., Puterbaugh, K. and Troy, J. “Six degree-of-freedom haptic rendering using voxel sampling.” Proc. ACM SIGGRAPH 1999.


References (2)
• Ruspini, D., Kolarov, K. and Khatib, O. “The haptic display of complex graphical environments.” Proc. ACM SIGGRAPH 1997.
• Salisbury, J.K. et al. “Haptic rendering: programming touch interaction with virtual objects.” Proc. ACM SIGGRAPH 1995.
• Salisbury, J.K. and Srinivasan, M.A. “Phantom-based haptic interaction with virtual objects.” IEEE Computer Graphics and Applications, 17(5), p. 6-10.
• Salisbury, J.K. “Making graphics physically tangible.” Communications of the ACM, 42(8), p. 74-81.
• Srinivasan, M.A. and Basdogan, C. “Haptics in virtual environments: taxonomy, research status, and challenges.” Computers & Graphics, 21(4), p. 393-404.
• Zilles, C.B. and Salisbury, J.K. “A constraint-based god-object method for haptic display.” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, p. 146-151, 1995.


References (3)
• Adams, R.J. and Hannaford, B. “A two-port framework for the design of unconditionally stable haptic interfaces.” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, 1998.
• Basdogan, C., Ho, C.H. and Srinivasan, M.A. “A ray-based haptic rendering technique for displaying shape and texture of 3D objects in virtual environments.” Proc. ASME Dynamic Systems and Control Division, 1997.
• Burdea, G. Force and Touch Feedback for Virtual Reality. John Wiley and Sons, 1996.
• Ehmann, S.A. and Lin, M.C. “Accurate and fast proximity queries between polyhedra using surface decomposition.” Proc. Eurographics, 2001.
• Gregory, A., Lin, M.C., Gottschalk, S. and Taylor, R.M. II. “H-Collide: a framework for fast and accurate collision detection for haptic interaction.” Proc. IEEE Virtual Reality Conference, 1999.
• Gregory, A., Mascarenhas, A., Ehmann, S., Lin, M.C. and Manocha, D. “Six degree-of-freedom haptic display of polygonal models.” Proc. IEEE Visualization, 2000.

References (4)
• Hayward, V. and Armstrong, B. “A new computational method of friction applied to haptic rendering.” Lecture Notes in Control and Information Sciences, Vol. 250, Springer-Verlag, 2000.
• Kim, Y.J., Otaduy, M.A., Lin, M.C. and Manocha, D. “Six-degree-of-freedom haptic display using localized contact computations.” Proc. Haptics Symposium, 2002.
• Mark, W.R., Randolph, S.C., Finch, M., Van Verth, J.M. and Taylor, R.M. II. “Adding force feedback to graphics systems: issues and solutions.” Proc. ACM SIGGRAPH, 1996.
• McNeely, W.A., Puterbaugh, K.D. and Troy, J.J. “Six degree-of-freedom haptic rendering using voxel sampling.” Proc. ACM SIGGRAPH, 1999.
• Ruspini, D.C., Kolarov, K. and Khatib, O. “The haptic display of complex graphical environments.” Proc. ACM SIGGRAPH, 1997.
• Zilles, C. and Salisbury, K. “A constraint-based god-object method for haptic display.” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, 1995.

This presentation is partly based on presentations by the following people: – Pierre Boulanger, Department of Computing Science, University of Alberta – Max Smolens, University of North Carolina at Chapel Hill – Ming C. Lin, University of North Carolina at Chapel Hill – Ida Olofsson, Reachin Technologies AB
