Exploring Gesture Control with Depth Sensing Camera in Virtual Reality

Submitted by Yin Zhang M. Sc. Candidate Under The Guidance of Dr. Oscar Meruvia-Pastor, Assistant Professor, Department of Computer Science

Acknowledgements I would like to thank my project guide Dr. Oscar Meruvia for his valuable guidance, suggestions and timely help in the completion of this project. I would also like to extend my sincere gratitude to Dr. Ed Brown for his support in providing equipment that was instrumental to the successful completion of this project. Last but not least, I want to express my deep gratitude to friends such as Shiyao Wang and Amit Prabhakar Desai, who provided a helping hand through direct and indirect co-operation and suggestions.

Abstract Immersive Virtual Reality (VR) has become more affordable with the appearance of VR headsets such as the Oculus Rift, Google Cardboard or the Samsung Gear. In addition, portable depth-sensing cameras allow users to control interfaces using hand gestures at a short range from the camera. Both technologies are being put together to create immersive VR experiences that respond to hand gestures. The goal of this project is to explore gesture-based interaction in immersive VR environments using Unity, Leap Motion, a smartphone and the FreeFly VR headset. We implemented a system that allows users to play a game in a virtual world, with the ultimate objective of exploring improvements to this type of system.

I. Introduction

With more and more VR headsets such as the Oculus Rift [1], HTC Vive [2], PlayStation VR [3] and Samsung Gear VR [4] released, VR gaming platforms have become quite popular these days. However, the headsets mentioned above are far from the first to market. Back in 1995, the Virtual Boy [5] released by Nintendo was the most prominent example of the first wave of commercial virtual reality devices. (Figure 1) The failure of that 90's wave of virtual reality was so complete that it set the whole field back for a generation.

Figure 1. The Virtual Boy: a clunky, desk-mounted device with low-resolution, non-real-time image refreshing and rendering.

The most recent rise of virtual reality is largely credited to Palmer Luckey [6], the founder of Oculus, whose Rift headsets set off the new wave of interest in virtual reality that continues today. In the meantime, Steve Jobs [7] popularized the modern smartphone, with high-resolution screens, accurate motion sensors and compact form factors similar to the technology required to make convincing VR.

As Google Cardboard [8] and the Samsung Gear VR first illustrated in 2015, a low-cost, affordable and more casual form of VR combined with smartphones came into public view. By sliding a smartphone into a head-mounted display, users can gain access to relatively simple virtual worlds. There was another significant leap forward in the smartphone hardware industry in 2016: depth-sensing cameras. A depth sensor makes a smartphone aware of its immediate environment and helps identify the user's body within it. But there are some barriers to implementing this major change: cost, convenience and adoption.

The motivation of this project is to explore VR gesture control with a low-cost combination of a Samsung phone, Leap Motion and a VR headset, discover implementation challenges, and propose solutions to overcome the major barriers along the way. We are mainly interested in discovering ways for users to easily control a simple VR game with their own hand motions, using the Unity engine.

II. Background and Related Work

With the release of new VR headsets such as the Oculus Rift DK2, Oculus Rift CV1 and HTC Vive, gesture control with Leap Motion in Oculus is widely used in the VR gaming scene (Figure 5). The Unity game engine provides assets for open-source development using the Oculus Rift and Leap Motion Orion on Windows 7-10 systems. A developer can implement a 360° gaming view, real-time head-motion tracking through the Oculus Rift motion sensor, and hand or finger gesture recognition through the Leap Motion. Yet even with appealing and mature development support, this approach has strict requirements on devices, operating systems and the Unity engine version, which greatly limits developers' freedom in game development. For newcomers to the gaming market, these setup barriers stop most developers from going further in this area.

Affordability and convenience are still the major factors that affect new developers. A barrier to the development of new solutions is the price and availability of the newest VR platforms. In this project, we put together a low-cost development environment that emulates the experience provided by devices such as the Oculus Rift and the HTC Vive, by combining a generic VR headset with an off-the-shelf smartphone and a Leap Motion controller connected to a laptop that executes the whole immersive gaming experience.

Figure 5. Oculus Rift mounted with Leap Motion.

Before demonstrating my implementation, some system components need to be described.

1. Leap Motion [9]

The Leap Motion is a computer hardware sensor device that takes hand and finger motions as input, requires no manual contact with the controller, and can work as a replacement for a standard desktop mouse.

The Leap Motion controller was originally designed to be placed on a physical desktop, facing upward (Figure 2), but it can also be mounted on a VR headset as a gesture recognition device [10]. The device uses three LED emitters to illuminate the surrounding space with IR light, which is reflected back from nearby objects and captured by two IR cameras. The device's software, installed on the connected PC, analyzes the captured images in real time, determines the positions of objects, and performs the recognition of the user's hands and fingers.

Figure 2. Leap Motion Controller and the coordinate system used to describe positions in its sensory space.

A study of the controller's performance [11] revealed that the device's FOV is an inverted pyramid centered on the device. The effective range of the controller extends from approximately 3 to 30 cm above the device (y-axis), approximately 0 to 20 cm behind the device (negative z-axis), and 20 cm in each lateral direction along the device (x-axis). The standard deviation of the measured position of a static object was shown to be less than 0.5 mm. (Figure 3)
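These quoted ranges can be checked programmatically. The sketch below (illustrative Python, with a hypothetical function name) approximates the effective volume as an axis-aligned box in the Leap coordinate frame; the actual FOV is an inverted pyramid, so this is only a rough bound:

```python
def in_effective_range(x_cm, y_cm, z_cm):
    """Rough check that a point (in cm, Leap Motion coordinates) lies inside
    the controller's effective sensing volume, approximated as a box:
    3-30 cm above the device (y), 0-20 cm behind it (negative z),
    and up to 20 cm to either side (x)."""
    return (3.0 <= y_cm <= 30.0
            and -20.0 <= z_cm <= 0.0
            and -20.0 <= x_cm <= 20.0)
```

A tracked hand position failing this test is a hint that accuracy will degrade, which matters later when the controller is head-mounted and hands move in and out of the volume.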

Figure 3. Leap Motion controller FOV

The smaller observation area and higher resolution of the device differentiates the product from the Kinect, which is more suitable for whole-body tracking in a space the size of a living room.

2. Free Fly VR & Samsung Phone [12]

In this project, the Free Fly VR headset and a Samsung Galaxy S5 are used as the virtual reality visualization devices in the game implementation phase. (Figure 4) The Free Fly VR is a head-worn mobile VR headset compatible with a wide range of phone sizes, from 135 mm in length up to 165×90 mm. Free Fly VR tracks the user's 360° head movements and renders 3D visuals on the phone screen, providing an immersive VR experience. The Free Fly has a lightweight design, with a total weight of 0.7 kg, a 120° FOV and fully blacked-out contours. Its design makes it comfortable to wear and easy to adjust and set up for a variety of phones. In our case, we used a Samsung Galaxy S5.

Figure 4. Free Fly VR headset.

3. Unity 5.3 [13]

The latest version of the Unity engine, Unity 5.3, is used in the project as the game development platform. Unity 5.3 contains improved built-in support for VR development, providing a robust and fully featured solution for the creation of immersive VR content, as well as the newest depth-sensing assets, like the Leap Motion Core Asset, which allowed us to implement the gaming environment for the project quickly.

III. Project Implementation

1. Project Setup

As for the software part, this project is implemented in Unity 5.3 on an OS X El Capitan system. The OS X system is equipped with Leap Motion SDK_Mac_2.3.1 and Splashtop Streamer [14], with the LeapMotion Core Asset_2_3_1 [15] package installed in the Unity engine. Meanwhile, the Samsung S5 phone has the Free Fly VR SDK and Splashtop Streamer installed.

The Leap Motion SDK connects the Leap Motion controller to the PC, allowing the captured data from the Leap Motion to be delivered to the analysis software on the OS X system in real time. The LeapMotion Core Asset package, imported into the Unity engine, functions as the connector between the Unity engine and the Leap Motion analysis software: it translates the input data from the analysis software into control signals and generates the corresponding hand model and gestures. Splashtop Streamer mirrors the PC screen onto any mobile device on the same network with the Splashtop client installed. (Figure 6)

Figure 6. Software setup flow chart.

As for the hardware part, the Leap Motion is always connected to the PC through the USB port, whereas the Samsung phone wirelessly mirrors the PC screen. The Samsung phone is then slid into the Free Fly VR headset and calibrated for the horizontal and vertical axes with the help of the Free Fly VR SDK. In this project, there are two conditions with regard to the location of the Leap Motion controller: either it is horizontally placed on the desk, or it is mounted on top of the VR headset facing outward towards the PC screen. (Figure 7)

Figure 7. Hardware setup.

2. Applications Developed

2.1 Dial Panel

In order to explore the experience of point-touch accuracy with Leap Motion gesture control, we simulated a dial panel with 9 number buttons and two symbol buttons: # and *. (Figure 8)

Figure 8. Dial Panel.

A box collider component on each button allows the button to react to physical collisions with the generated hand model. Multiple canvas layers on each button work with the button color-changing script, allowing the button to change color when it is picked by the finger model, as clear visual feedback. (Figure 9)

Figure 9. Button canvas.

With a panel plane beneath the button layer, a shadow is cast onto the plane following the hand motion, which provides a direct sense of the distance between the hand and the panel. The button's color remains changed after the first pick, and only changes back on a second pick.
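The latching pick behaviour just described amounts to a two-state toggle per button. A minimal sketch in Python with hypothetical names (the project itself implements this as a Unity color-changing script):

```python
class DialButton:
    """Two-state model of the panel button feedback: the first pick
    latches the highlight color, and a second pick clears it."""

    def __init__(self, label):
        self.label = label
        self.highlighted = False

    def on_finger_pick(self):
        # each physical collision with the finger model toggles the highlight
        self.highlighted = not self.highlighted
        return self.highlighted
```

In the Unity version, the toggle would be driven by the collision callback on the button's box collider rather than an explicit method call.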

2.2 Tetris Game

In this section, I will introduce the well-known 2D Tetris game [16] implemented in a VR environment. A random Tetris block group is generated at the top center of the game area. (Figures 10, 11)

Figures 10, 11. Block groups & spawner for the first random group.

For implementing the rest of the game-play features, we need the following helper functions:
• Check if all the blocks are between the borders
• Check if all the blocks are above y = 0
• Check if a group can be moved to a certain position
• Check if a row is full of blocks
• Delete a row
• Decrease a row's y coordinate
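Assuming a standard grid representation (a 2-D array with y = 0 at the bottom, in the spirit of the Tetris tutorial [16]), the helpers above can be sketched in Python; all names and dimensions are illustrative, not the project's actual Unity scripts:

```python
GRID_W, GRID_H = 10, 20
# grid[y][x] holds the settled block occupying that cell, or None if empty
grid = [[None] * GRID_W for _ in range(GRID_H)]

def inside_borders(cells):
    """Check that every (x, y) cell of a group lies between the side borders."""
    return all(0 <= x < GRID_W for x, y in cells)

def above_floor(cells):
    """Check that every cell of a group is at or above y = 0."""
    return all(y >= 0 for x, y in cells)

def can_move_to(cells):
    """Check that a group can occupy the given cells: inside the borders,
    above the floor, and not colliding with already-settled blocks."""
    return (inside_borders(cells) and above_floor(cells)
            and all(y >= GRID_H or grid[y][x] is None for x, y in cells))

def row_is_full(y):
    """Check whether row y is completely filled with blocks."""
    return all(grid[y][x] is not None for x in range(GRID_W))

def delete_row(y):
    """Empty every cell of row y."""
    for x in range(GRID_W):
        grid[y][x] = None

def decrease_rows_above(deleted_y):
    """Move every settled block above the deleted row down by one cell."""
    for y in range(deleted_y + 1, GRID_H):
        for x in range(GRID_W):
            grid[y - 1][x] = grid[y][x]
            grid[y][x] = None
```

Clearing a line is then `delete_row(y)` followed by `decrease_rows_above(y)` for each full row found.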

For the implementation of the game control, we built a game control engine with the following functions:
• Move a block horizontally and rotate it
• Speed up a block's fall
• Clear a filled horizontal line
• Spawn the next group
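The movement and rotation operations can be expressed as coordinate transforms on a group's cells. A minimal Python sketch, with illustrative names and the y-axis pointing up (the actual project does this through Unity transforms):

```python
def move(cells, dx, dy):
    """Translate every cell of a group by (dx, dy); applying
    dy = -1 repeatedly gives the sped-up fall."""
    return [(x + dx, y + dy) for x, y in cells]

def rotate_cw(cells, pivot):
    """Rotate a group 90 degrees clockwise about its pivot cell."""
    px, py = pivot
    return [(px + (y - py), py - (x - px)) for x, y in cells]
```

Each candidate transform would be validated against the border, floor and collision checks before being committed to the grid.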

The game control functions implemented above are triggered by physical collisions between the control panel and the hand model instead of the traditional keyboard keys. (Figure 12)

Figure 12. Control panel

2.3 Comparison between head-mounted and non-head-mounted Leap Motion positions

Both the dial panel and the Tetris game implementations were compared under head-mounted and table-placed conditions. The Leap Motion is horizontally placed on the desk in the table-placed condition, and it is mounted on top of the VR headset facing outward towards the PC screen in the head-mounted condition. (Figure 13)

Figure 13. Leap Motion positions comparison.

3. Results and Evaluation

After the game and simulation were implemented, I needed to add the left- and right-eye cameras to fit the Free Fly VR headset. (Figure 15)

Figure 15. Camera Settings.

In the implementation of the dial panel simulation, we obtained a stable result: users can predict the placement of the hand using the shadow effect and easily choose a button using the visual feedback provided. (Figure 16)

Figure 16. Result of Dial Panel Simulation

During the implementation process, we found that the generated hand model can be misplaced, in that it did not reflect the real-world hand position, which increased the chance of missing a collision with the buttons. To address this, we limited the collision mesh to the index finger, so that in the application only the index finger can trigger a physical collision. The video demo can be found here. (Video demo link: https://youtu.be/7xu-lR_j1T8)
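Conceptually, the index-finger restriction acts as a filter over collision events. A minimal sketch in Python with hypothetical event fields; the project itself achieves this by shrinking the collision mesh in Unity rather than filtering events:

```python
def filter_index_finger(collision_events):
    """Keep only the collision events raised by the index-finger model,
    so stray contacts from other fingers cannot trigger a button."""
    return [e for e in collision_events if e.get("finger") == "index"]
```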

As for the implementation of the Tetris game, the control panel can be really tricky to operate during the playing phase. As a result of various evaluation sessions with my supervisor Dr. Oscar Meruvia, we made several modifications to the game control panel regarding its size, transparency, shadow effects and visual feedback on collision, with the goal of improving the user experience. Eventually, the panel no longer blocked the view and became reachable and controllable, but it was still quite distracting when trying to control the blocks. Overall, the control panel did not feel as comfortable as the keyboard, from the point of view of the gaming experience. (Figure 17) (Video demo link: https://youtu.be/1Ndsxx75Wdw)

Figure 17. Result of Tetris Game

As for the comparison of the head-mounted versus table-placed conditions for the location of the Leap Motion controller, the head-mounted version more closely resembles the original setup of the Oculus Rift combined with the Leap Motion, which gave a more immersive VR experience. However, placing the Leap Motion Core Asset within the Unity environment was more challenging in this condition, so the hand-gesture accuracy was not as stable as in the table-placed configuration.

IV. Conclusions and Future Work

IV.1 Conclusions

Through the implementation of this project, we found that stable tracking and a reachable, controllable panel were the most challenging aspects of this work. The built-in Leap Motion analysis system has its own limitations in reflecting the real-world model, but it is still workable for real-world gaming environment simulations. The tracking stability can also be improved by Leap Motion pre-calibration. Still, the current version of the Leap Motion SDK gives lower tracking quality when the user makes a fist or extends a single finger [13]. The detection stability can vary across different game implementations.

Virtual reality can give the user a more immersive gaming experience, but it requires more support from the hardware. A low-cost Samsung phone working as a VR display, supported by a simple, lightweight VR headset, constitutes a more affordable entry into VR development for people who want to try the technology in these early stages. With no significant barriers to implementing a testing platform, any Unity-developed game can be mirrored to a simple mobile device.

IV.2 Future Work

While testing the project prototype with my supervisor, Prof. Oscar Meruvia, we noticed that a more intuitive approach to playing the Tetris game could be to replace the button-based control panel with a hand-waving controller, which seemed a more natural approach for the Tetris game in particular. Right before the end of the project, I implemented this possibility using the Unity physics engine. Due to time constraints, the physics engine did not work well with the 2D game script; however, my early attempts at this other type of controller seemed encouraging, in that it could improve the gaming experience by providing a more natural controller in this case. The best way to test an implementation would be a user study that validates user performance and satisfaction across different controllers as the depth-sensing devices and across different usage conditions.

V. References

[1] "Oculus". Oculus.com. N.p., 2016. Web. 19 Apr. 2016.
[2] "Vive | Home". Htcvive.com. N.p., 2016. Web. 19 Apr. 2016.
[3] "PlayStation VR". PlayStation. N.p., 2016. Web. 19 Apr. 2016.
[4] "Samsung Gear VR - The Official Samsung Galaxy Site". The Official Samsung Galaxy Site. N.p., 2016. Web. 19 Apr. 2016.
[5] "Unraveling The Enigma Of Nintendo's Virtual Boy, 20 Years Later". Fast Company. N.p., 2015. Web. 19 Apr. 2016.
[6] Clark, Taylor. "How Palmer Luckey Created Oculus Rift". Smithsonian. N.p., 2014. Web. 19 Apr. 2016.
[7] Whitwam, Ryan. "How Steve Jobs Killed The Stylus And Made Smartphones Usable". ExtremeTech. N.p., 2016. Web. 19 Apr. 2016.
[8] "Google Cardboard – Google". Google.com. N.p., 2016. Web. 19 Apr. 2016.
[9] "Get Started | Leap Motion Developers". Developer.leapmotion.com. N.p., 2016. Web. 14 Apr. 2016.
[10] Bansal, Bharti. "Gesture Recognition: A Survey". International Journal of Computer Applications 139.2 (2016): 8-10. Web.
[11] Silva, Édimo Sousa, and Maria Andréia Formico Rodrigues. "Design and Evaluation of a Gesture-Controlled System for Interactive Manipulation of Medical Images and 3D Models." SBC Journal on Interactive Systems 5.3 (2014): 53-65.
[12] "Freefly VR How Does It Work, The Complete Guide". FreeflyVR. N.p., 2016. Web. 14 Apr. 2016.

[13] "Unity - What's New In Unity 5.3". Unity3d.com. N.p., 2016. Web. 19 Apr. 2016.
[14] "Splashtop - Top-Performing Remote Desktop And Remote Support". Splashtop Inc. N.p., 2016. Web. 19 Apr. 2016.
[15] "Unity | Leap Motion Developers". Developer.leapmotion.com. N.p., 2016. Web. 19 Apr. 2016.
[16] "Noobtuts - Unity 2D Tetris Tutorial". Noobtuts.com. N.p., 2016. Web. 19 Apr. 2016.
[17] Hern, Alex. "Will 2016 Be The Year Virtual Reality Gaming Takes Off?". The Guardian. N.p., 2015. Web. 19 Apr. 2016.
[18] Adhikarla, Vamsi et al. "Exploring Direct 3D Interaction For Full Horizontal Parallax Light Field Displays Using Leap Motion Controller". Sensors 15.4 (2015): 8642-8663. Web.
[19] Guna, Jože et al. "An Analysis Of The Precision And Reliability Of The Leap Motion Sensor And Its Suitability For Static And Dynamic Tracking". Sensors 14.2 (2014): 3702-3720. Web.
[20] Higuchi, Masakazu, and Takashi Komuro. "Robust Finger Tracking For Gesture Control Of Mobile Devices Using Contour And Interior Information Of A Finger". MTA 1.3 (2013): 226-236. Web.
[21] Freeman, William T. "Hand gesture control system." U.S. Patent No. 6,002,808. 14 Dec. 1999.
[22] Welch, Greg, and Gary Bishop. "SCAAT: Incremental tracking with incomplete information." Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques. ACM Press/Addison-Wesley Publishing Co., 1997.
[23] Arthur, Kevin Wayne. Effects of Field of View on Performance with Head-Mounted Displays. Diss. University of North Carolina at Chapel Hill, 2000.
