University of Applied Sciences Ravensburg-Weingarten
Scientific Project
Visualization of industrial robot with Augmented Reality
Presented by
Nayan Kadam (28725), Master Mechatronics
Bharath Chandra Reddy Vadde (29183), Master Mechatronics
Supervised by Prof. Dr. K. Wöllhaf
Duration: 03.05.2017 – 30.09.2017
Abstract
While robots are becoming increasingly capable and autonomous, they may still require our help to carry out tasks in real-world environments. That is why we continue to look for new ways to let human operators control robots without extensive training. Augmented reality is one of the best ways to visualize an industrial robot. Augmented reality adds information and meaning to a real object or place. Unlike virtual reality, augmented reality does not create a simulated reality; instead, it takes a real object or space and uses technology to add contextual data that deepens our understanding of it. This project visualizes an industrial robot in augmented form and describes the design and control of an augmented model of a KUKA robot. It gives a brief explanation of the available platforms, such as Windows, Android, and iOS, on which an augmented model of a robot can be illustrated successfully. Of the many available platforms for augmented reality, we have chosen Vuforia, a free platform that enables the creation of augmented reality applications and allows users to visualize objects virtually. The well-known engine Unity3D, which supports Windows and Android, has been used throughout this project. Ease of operation is one of the key features of our augmented reality product; another advantage is that it allows the user to control the augmented model of the KUKA robot. We designed an Android-based app that acts as a controller for the virtual model of a six-axis KUKA robot. The app shall be available on the app store in the coming months. This is the beginning of the augmented reality era and a first step toward controlling a virtual model of any industrial robot. We hope to map a virtual robot to a real industrial robot in future work.
Contents
Abstract ... i
List of Figures ... iii
1. Introduction ... 1
1.1. Goal of the project ... 1
1.2. Structure of the work ... 2
2. What is Augmented Reality ... 3
2.1. Types of Augmented Reality ... 3
2.2. How does Augmented Reality work ... 5
2.3. Augmented Reality Software ... 7
2.3.1. Open-source software ... 7
2.3.2. Augmented Development Toolkits ... 8
2.3.3. Importance of Unity3D over other AR Toolkits ... 9
3. Vuforia Tools Platform Main Components ... 10
4. Getting Started with Vuforia ... 11
4.1. How To Install the Vuforia Unity Extension ... 11
4.2. How To Set Up a Simple Unity Project ... 11
4.2.1. Create a project ... 12
4.2.2. Obtain a License Key and create targets ... 12
4.2.3. Adding Targets ... 13
4.2.4. Inside Unity3D ... 13
4.3. Deploy the Application ... 16
4.3.1. Android development process ... 16
4.3.2. iOS development process ... 17
5. Main Project: Augmented KUKA Robot ... 18
5.1. Stage 1: Procedure for Vuforia and Unity3D ... 18
5.2. Stage 2: Procedure for the VRML model in Blender and creating the .fbx file ... 20
5.3. Stage 3: Program for the augmented KUKA robot ... 23
5.4. Stage 4: Creating the controller (buttons) for the augmented KUKA robot ... 31
5.5. Stage 5: Android app development ... 32
6. Applications of AR in industrial robotics ... 36
7. Mapping the virtual robot to a real industrial robot ... 37
8. Final augmented model of the KUKA robot ... 40
References ... 41
List of Figures
Figure 2.1: Marker-Based Augmented Reality, Ref. ARreverie (AR website) ... 3
Figure 2.2: Markerless Augmented Reality, Ref. ARreverie (AR website) ... 3
Figure 2.3: Projection-Based Augmented Reality, Ref. ARreverie (AR website) ... 4
Figure 2.4: Superimposition-Based AR, Ref. ARreverie (AR website) ... 4
Figure 4.1: Unity3D screenshot of the Project section, Ref. Vuforia Developer portal ... 11
Figure 4.2: Platform Anatomy, Ref. Vuforia Developer portal ... 12
Figure 4.3: Unity3D Project section, Ref. Vuforia Developer portal ... 13
Figure 4.4: Unity3D Inspector section, Ref. Vuforia Developer portal ... 14
Figure 4.5: Vuforia configuration, Ref. Vuforia Developer portal ... 16
Figure 4.6: Output of a simple AR project, Ref. Vuforia Developer portal ... 17
Figure 5.1: Create License Key, Ref. Vuforia License Manager ... 18
Figure 5.2: Augmented visualization of the robot, Ref. screenshot from the mobile app ... 19
Figure 5.3: VRML KUKA robot model in Blender, Ref. screenshot of the model in Unity3D ... 20
Figure 5.4: Rigging process of the KUKA robot, Ref. screenshot of the model in Unity3D ... 20
Figure 5.5: Limiting the rotation of the robot's axes, Ref. screenshot of the model in Unity ... 21
Figure 5.6: Final rigging of the KUKA robot and parenting objects to the bones, Ref. screenshot of the model in Unity3D ... 21
Figure 5.7: Final model of the KUKA robot after rigging, Ref. screenshot of the model in Unity ... 22
Figure 5.8: Visualization of the KUKA robot without controls, Ref. mobile screenshot ... 22
Figure 5.9: Design of buttons for the controller, Ref. screenshot in Unity3D ... 31
Figure 5.10: Final design of the controller for the augmented KUKA robot, Ref. screenshot in Unity3D ... 32
Figure 5.11: Installation of the Android SDK tool, Ref. Unity3D tutorial ... 33
Figure 5.12: Build Settings window, Ref. Unity3D tutorial ... 34
Figure 5.13: Bundle Identifier identification, Ref. Unity3D tutorial ... 34
Figure 5.14: External Tools, Ref. Unity3D tutorial ... 35
Figure 5.15: Build Settings window, Ref. Unity3D tutorial ... 35
Figure 6.1: AR in manufacturing, Ref. 48th CIRP Conference on Manufacturing ... 36
Figure 6.2: ABB's holographic industrial robot, Ref. blog on LinkedIn by Anna Vondrackova ... 36
Figure 7.1: Positioning fiducial markers, Ref. IEEE Spectrum ... 37
Figure 7.2: Virtual view of the model, Ref. IEEE Spectrum ... 37
Figure 7.3: Pick-and-place augmented operation, Ref. IEEE Spectrum ... 38
Figure 7.4: Pick-and-place industrial robot operation, Ref. IEEE Spectrum ... 38
Figure 7.5: Mapping of the augmented and real industrial robot, Ref. IEEE Spectrum ... 39
Figure 8.1: Augmented visualization and control of the KUKA robot in the app, Ref. mobile screenshots ... 40
1. Introduction
Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are "augmented" by computer-generated or extracted real-world sensory input such as sound, video, graphics, or GPS data. It is related to a more general concept called computer-mediated reality, in which a view of reality is modified by a computer. There are several techniques available to control an industrial robot, such as industrial robot controllers, computer programs, and mobile controllers. Augmented reality can be used to operate the robot with augmented visualization and a virtual experience.
Thirty-four percent of top games are made with Unity, and Unity reaches 770 million gamers all over the world through games made with this engine. Vuforia is an augmented reality software development kit (SDK) for mobile devices that enables the creation of augmented reality applications. It uses computer vision technology to recognize and track image targets and simple 3D objects. This image registration capability enables developers to position and orient virtual objects, such as 3D models and other media, in relation to real-world images when these are viewed through the camera of a mobile device. The virtual object then tracks the position and orientation of the image in real time, so that the viewer's perspective on the object corresponds to their perspective on the image target and the virtual object appears to be part of the real-world scene. These two development platforms work together and give an impressive virtual experience. The idea behind this project was to make use of these two platforms and implement the visualization of any industrial robot, operated from an Android smartphone.
1.1. Goal of the project
The main tasks in the project are as follows:
- Demonstration of an Augmented Reality toolkit, operated on both Android and Windows platforms
- Integration of a 3D object: showing the KUKA robot in 3D form
- Controlling the movement of the six axes of the KUKA robot
- Visualization of the KUKA robot in Augmented Reality
- Design of a controller as an Android app to control the augmented KUKA robot
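Controlling the six axes from app buttons amounts to incrementing one joint angle at a time while keeping it inside that joint's mechanical range. The sketch below illustrates this clamping logic in Python; the limit values and function names are purely illustrative placeholders of our own, not actual KUKA specifications.

```python
# Hypothetical per-joint limits in degrees (illustrative values only,
# not taken from any KUKA datasheet).
JOINT_LIMITS = [(-185, 185), (-155, 35), (-130, 154),
                (-350, 350), (-130, 130), (-350, 350)]

def jog_joint(angles, joint, delta):
    """Return new joint angles after jogging one axis, clamped to its limits."""
    lo, hi = JOINT_LIMITS[joint]
    new_angles = list(angles)
    new_angles[joint] = max(lo, min(hi, angles[joint] + delta))
    return new_angles
```

A "rotate axis 2 by +50°" button press on a robot resting at zero would stop at the upper limit (35° with the placeholder table above) rather than driving the virtual joint through the model.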
1.2. Structure of the work
This project is divided into four parts. The first part gives an overall idea of augmented reality and the software that can be used for virtual demonstration of an object. It also explains how to work with Vuforia and Unity3D and the advantages of Unity3D over other AR toolkits. The second part demonstrates how one can develop an augmented model with a simple example; it also gives a basic idea of creating a project and adding targets and license keys, and explains the Android and iOS platforms. The third and most important part of this project gives a step-by-step explanation of the design and visualization of the augmented model of the KUKA robot, along with guidelines on programming and app development. The project ends with the industrial use of augmented reality and how one can map an augmented model of a robot to a real industrial robot.
2. What is Augmented Reality
The origin of the word augmented is augment, which means to add something. In the case of augmented reality (also called AR), graphics, sounds, and touch feedback are added to our natural world. Unlike virtual reality, which requires you to inhabit an entirely virtual environment, augmented reality uses your existing natural environment and simply overlays virtual information on top of it. As the virtual and real worlds harmoniously coexist, users of augmented reality experience a new and improved world in which virtual information is used as a tool to assist in everyday activities. AR can be defined as an enhanced version of reality in which live direct or indirect views of physical real-world environments are augmented with computer-generated images superimposed over the user's view of the real world, thus enhancing the current perception of reality.
2.1. Types of Augmented Reality
Marker-Based Augmented Reality: Marker-based augmented reality (also called image recognition) uses a camera and some type of visual marker, such as a QR/2D code, to produce a result only when the marker is sensed by a reader. Marker-based applications use the camera on the device to distinguish a marker from any other real-world object. Distinct but simple patterns (such as a QR code) are used as markers because they can be easily recognized and do not require a lot of processing power to read. The position and orientation of the marker is also calculated, and some type of content and/or information is then overlaid on the marker. [4]
Fig. 2.1 Ref. ARreverie (AR online learning)
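The claim that simple patterns need little processing power can be made concrete: once the marker's square region has been located and rectified, decoding reduces to averaging each grid cell and thresholding. The following NumPy sketch is our own simplification of that idea, not the algorithm of any specific tracker.

```python
import numpy as np

def decode_marker(gray, grid=4, thresh=128):
    """Decode a rectified square fiducial image into a grid of bits
    by averaging each cell and thresholding the result."""
    h, w = gray.shape
    bits = np.zeros((grid, grid), dtype=int)
    for r in range(grid):
        for c in range(grid):
            cell = gray[r * h // grid:(r + 1) * h // grid,
                        c * w // grid:(c + 1) * w // grid]
            bits[r, c] = int(cell.mean() > thresh)
    return bits

# Synthetic 16x16 grayscale "camera view" of a 4x4 marker pattern.
pattern = np.array([[1, 0, 1, 0],
                    [0, 1, 1, 0],
                    [1, 1, 0, 0],
                    [0, 0, 0, 1]])
image = np.kron(pattern, np.ones((4, 4))) * 255
```

With a clean synthetic image, `decode_marker(image)` recovers the original bit pattern; real trackers add perspective correction and error-correcting codes on top of this core step.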
Markerless Augmented Reality: As one of the most widely implemented applications of augmented reality, markerless (also called location-based, position-based, or GPS) augmented reality uses a GPS, digital compass, velocity meter, or accelerometer embedded in the device to provide data based on your location. A strong force behind markerless augmented reality technology is the wide availability of smartphones and the location-detection features they provide. It is most commonly used for mapping directions, finding nearby businesses, and other location-centric mobile applications. [4]
Fig. 2.2 Ref. ARreverie (AR online learning)
Projection-Based Augmented Reality: Projection-based augmented reality works by projecting artificial light onto real-world surfaces. Projection-based applications allow for human interaction by sending light onto a real-world surface and then sensing the human interaction (i.e., touch) with that projected light. Detecting the user's interaction is done by differentiating between an expected (or known) projection and the altered projection caused by the user's interaction. Another interesting application of projection-based augmented reality uses laser plasma technology to project a three-dimensional (3D) interactive hologram into mid-air. [4]
Fig. 2.3 Ref. ARreverie (AR online learning)
Superimposition Based Augmented Reality:
Superimposition-based augmented reality either partially or fully replaces the original view of an object with a newly augmented view of that same object. Object recognition plays a vital role here, because the application cannot replace the original view with an augmented one if it cannot determine what the object is. A strong consumer-facing example of superimposition-based augmented reality is the Ikea augmented reality furniture catalogue: by downloading an app and scanning selected pages of the printed or digital catalogue, users can place virtual Ikea furniture in their own home. [4]
Fig. 2.4 Ref. ARreverie (AR online learning)
2.2. How does Augmented Reality work
To understand how augmented reality technology works, one must first understand its objective: to bring computer-generated objects into the real world, where only the user can see them. In most augmented reality applications, a user sees both synthetic and natural light. This is done by overlaying projected images on top of a pair of see-through goggles or glasses, which allow the images and interactive virtual objects to layer on top of the user's view of the real world. Augmented reality devices are often self-contained, meaning that, unlike the Oculus Rift or HTC Vive VR headsets, they are completely untethered and do not need a cable or desktop computer to function.
Key components of augmented reality devices:
1. Sensors and cameras: Sensors are usually on the outside of the augmented reality device; they gather the user's real-world interactions and communicate them to be processed and interpreted. Cameras are also located on the outside of the device and visually scan the surroundings to collect data. The device takes this information, which often determines where surrounding physical objects are located, and formulates a digital model to determine the appropriate output. In the case of the Microsoft HoloLens, specific cameras perform specific duties, such as depth sensing. Depth-sensing cameras work in tandem with two "environment understanding cameras" on each side of the device. Another common type of camera is a standard several-megapixel camera (similar to the ones used in smartphones) that records pictures, videos, and sometimes information to assist with augmentation.
2. Projection: While "projection-based augmented reality" is a category in itself, here we refer specifically to a miniature projector often found in a forward- and outward-facing position on wearable augmented reality headsets. The projector can essentially turn any surface into an interactive environment. As mentioned above, the information taken in by the cameras examining the surrounding world is processed and then projected onto a surface in front of the user, which could be a wrist, a wall, or even another person. The use of projection in augmented reality devices means that screen real estate will eventually become a less important component. In the future, you may not need an iPad to play an online game of chess, because you will be able to play it on the tabletop in front of you. [7]
3. Processing: Augmented reality devices are basically mini supercomputers packed into tiny wearable devices. They require significant processing power and use many of the same components as our smartphones, including a CPU, a GPU, flash memory, RAM, a Bluetooth/Wi-Fi microchip, a global positioning system (GPS) microchip, and more. Advanced augmented reality devices such as the Microsoft HoloLens use an accelerometer (to measure the speed at which your head is moving), a gyroscope (to measure the tilt and orientation of your head), and a magnetometer (to function as a compass and determine which direction your head is pointing) to provide a truly immersive experience.
4. Reflection: Mirrors are used in augmented reality devices to assist with the way your eye views the virtual image. Some augmented reality devices have "an array of many small curved mirrors" (as in the Magic Leap device), while others have a simple double-sided mirror with one surface reflecting incoming light to a side-mounted camera and the other surface reflecting light from a side-mounted display to the user's eye. In the Microsoft HoloLens, the "mirrors" are see-through holographic lenses (Microsoft calls them waveguides) that use an optical projection system to beam holograms into your eyes. A so-called light engine emits light towards two separate lenses (one for each eye), each consisting of three layers of glass in three different primary colors (blue, green, red). The light hits those layers and then enters the eye at specific angles, intensities, and colors, producing a final holistic image on the eye's retina. Regardless of the method, all of these reflection paths have the same objective: to align the image with the user's eye.
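Whatever the display hardware, the step of placing a computer-generated object "into" the real world reduces to the same math: the tracker estimates the camera pose relative to the target, and the renderer projects each 3D point through a pinhole camera model. A minimal NumPy sketch of that projection (the intrinsic values here are invented for illustration, not from any real device):

```python
import numpy as np

def project(K, R, t, X):
    """Project a 3D world point X into pixel coordinates for a
    pinhole camera with intrinsics K and pose (R, t)."""
    Xc = R @ X + t            # transform point into the camera frame
    u, v, w = K @ Xc          # apply the intrinsic matrix
    return np.array([u / w, v / w])  # perspective divide

# Hypothetical intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.,   0., 320.],
              [  0., 800., 240.],
              [  0.,   0.,   1.]])
R, t = np.eye(3), np.zeros(3)  # camera at the origin, looking down +Z
```

A point 2 m straight ahead on the optical axis lands on the principal point, i.e. `project(K, R, t, np.array([0., 0., 2.]))` gives (320, 240); as the tracked pose `(R, t)` changes each frame, the rendered overlay stays locked to the target.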
2.3. Augmented Reality Software
2.3.1. Open-source software [6]
1. Argon: An augmented reality browser by Georgia Tech's GVU Center that uses a mix of KML and HTML/JavaScript/CSS to allow developing AR applications; any web content (with appropriate metadata and proper formatting) can be converted into AR content.
2. ARToolKit: An open-source (LGPLv3) C library for creating augmented reality applications; it has been ported to many languages and platforms, such as Android, Flash, and Silverlight.
3. ArUco: A minimal library for augmented reality applications based on OpenCV; BSD license; Linux, Windows.
4. Augment: An augmented reality platform for tablets and smartphones.
5. ATOMIC Authoring Tool: A multi-platform authoring tool for creating AR applications on Microsoft Windows, Linux, and Mac OS X.
6. Goblin XNA: A platform for researching 3D user interfaces, including mobile augmented reality and virtual reality, with an emphasis on games; written in C#, based on Microsoft XNA Game Studio 4.0; BSD license.
7. GRATF: An open-source (GPLv3) project that includes a C# library for the detection, recognition, and 3D pose estimation of optical glyphs, plus an application that does 2D and 3D augmented reality.
8. Mixare (mix Augmented Reality Engine): An open-source (GPLv3) augmented reality engine for Android and iPhone; works as a stand-alone application and as a basis for developing other implementations.
9. DroidAR: An open-source (dual-licensed: GPLv3 or commercial) augmented reality framework for Android, featuring location-based and marker-based AR.
10. GeoAR: An open-source (Apache 2.0 license) browser for Android, featuring location-based AR and a flexible data source framework.
11. BeyondAR: An open-source (Apache 2.0 license) augmented reality framework based on geolocalization for Android.
12. Mangan: An open-source augmented reality framework for Android.
2.3.2. Augmented Development Toolkits [6]
1. Unity3D: Unity is a leading game development platform, used to build high-quality 3D and 2D games and deploy them across mobile, desktop, and VR/AR.
2. Layar SDK: An augmented reality SDK for iOS and Android apps.
3. Catchoom CraftAR AR SDK: An iOS and Android SDK that renders augmented reality experiences, with plugins for Cordova and Unity.
4. Vuforia: An augmented reality SDK, formerly known as QCAR, for creating augmented reality applications for mobile devices.
5. Wikitude SDK: An augmented reality SDK for mobile platforms that originated from work on the Wikitude World Browser app by Wikitude GmbH. The Wikitude SDK was the first AR SDK to provide a JavaScript API for working with augmented reality experiences.
6. EasyAR: A free and easy-to-use alternative to Vuforia. Supported platforms: Android, iOS, UWP, Windows, Mac, and the Unity Editor. The latest version of EasyAR (1.3.1) supports image recognition only; version 2.0 will add 3D object recognition, environment perception, cloud recognition, and a smart glass solution.
7. Wikitude: Supported platforms: Android, iOS, smart glasses. Wikitude recently released SDK 6, a powerful SLAM solution for augmented reality apps. It implements image recognition and tracking, SLAM-based 3D tracking, geo data (improved handling of geo-referenced data), cloud recognition, improved extended tracking, and positioning.
8. Kudan: According to reviews and efficiency comparisons, Kudan is the main rival of Vuforia and makes augmented reality development very easy. Supported platforms: Android, iOS. Using SLAM technology, Kudan can recognize simple images and 3D objects and provides easy generation of the target database in the Unity Editor.
2.3.3. Importance of Unity3D over other AR Toolkits
1. It's free to get started: Unity comes in a free version and a Pro version, but unlike most software with both payment options, Unity's free version is feature-complete.
2. It's multi-platform: iOS, Android, Windows Phone, Mac, PC, Steam, PlayStation, Xbox, Wii, etc.
3. A thriving and supportive community: With more than 2 million developers using Unity, there are plenty of online resources to share the joys and frustrations of the program. If you ever get stuck on a development issue, want to chat with like-minded people, or are looking for an artist or developer to collaborate with on your next big idea, there are many forums where eager Unity fans unite. Speaking of Unite, that is also the annual conference Unity puts on, where you can meet your online Unity buddies in person in Europe or North America each summer.
4. The Asset Store: The Unity Asset Store is a great place to (a) find what you need for your game (a character, a building, etc.) without making it from scratch, or (b) make a little extra revenue if you are an artist, musician, or modeler. There is a submission process you must go through to sell your assets in the store, but once approved you receive 70% royalties on each purchase, which can be a fantastic way to fund your next game.
5. The ability to create multiplayer games: Some of the biggest multiplayer games on the web and mobile are built with Unity (Marvel Super Hero Squad, Solstice Arena). Building a multiplayer game is a massive undertaking, but the set of tools Unity provides and the support of the community made it possible to create the multiplayer game My Giants as intended – a task that would have been far harder without it.
6. Online tutorials and classes make it easy to learn: The really beautiful thing about Unity is how easy it is to learn. Sure, there is a bit of a learning curve in the beginning, but considering what you can do with the software, it is remarkably approachable. With several online courses and tutorials teaching the basics of Unity, you can learn how to get started at very low cost.
3. Vuforia Tools Platform Main Components
There are three main components to the Vuforia platform.
1. The Vuforia Engine: The Vuforia Engine is the client-side library that is statically linked to your app. It is available through the client SDK and supports Android, iOS, and UWP. You may use Android Studio, Xcode, Visual Studio, or Unity – the cross-platform game engine – to build apps.
2. Tools: Vuforia provides tools for creating targets, managing target databases, and securing application licenses. The Vuforia Object Scanner (available for Android) helps you scan 3D objects into a target format compatible with the Vuforia Engine. The Target Manager is a web app on the developer portal that lets you create databases of targets for use on the device and in the cloud (for large numbers of targets). Developers building apps for optical see-through digital eyewear can use the Calibration Assistant, which enables end users to create personalized profiles that suit their unique facial geometry; the Vuforia Engine can then use this profile to ensure that content is rendered in the right position. All applications need a license key to work. The License Manager allows you to create and manage your license keys and associated service plans.
3. Cloud Recognition Service: Vuforia also offers a Cloud Recognition Service for when your app needs to recognize a large set of images or when the database is frequently updated. The Vuforia Web Services API lets you manage these large image databases in the cloud efficiently and automate your workflows by integrating directly with your content management systems.
4. Getting Started with Vuforia
1. The Image Targets in Unity video tutorial shows how to author a simple Vuforia project using Image Targets in Unity.
2. How To Setup a Unity Project walks you through the steps for setting up, customizing, and building a simple Vuforia Unity project for both iOS and Android.
3. The Vuforia Play Mode for Unity video tutorial shows how to rapidly prototype your app in Unity using the Vuforia simulator in Play Mode. [8]
4.1 How To Install the Vuforia Unity Extension
The Vuforia Unity Extension enables the development of vision-based apps using the Unity game engine. Developers must obtain the Unity Editor, along with the Unity Android and/or Unity iOS mobile plugins, from Unity Technologies. To use Vuforia in Unity, you need to import the Vuforia Unity Extension into your Unity project.
Note: The Vuforia Unity Extension is compatible with both Unity Standard and Unity Pro. Visit the Unity website for further information about Unity and how to download it. If you already have Unity and need to migrate an existing Vuforia project to a new extension version, see How to Migrate a Unity Project.
Importing the extension [8]
1. Download the Vuforia Unity Extension.
2. Open your Unity project or create a new one.
3. Import the Vuforia Unity Extension by double-clicking the extension's *.unitypackage file on your file system, or by selecting Assets > Import Package > Custom Package in the Editor's menu and then selecting the *.unitypackage file.
4. The extension archive will self-install into your project, adding folders with plugins and libraries to your project.
You will use the online Target Manager to create and manage the target images that your app can recognize and track. The Vuforia Target Manager Guide provides an introduction to working with the Target Manager to create targets and target databases. For tips on selecting and designing target images, review the Image Targets Guide. [8]
Fig. 4.1 Unity3D screenshot of the project section, Ref. Vuforia Developer portal
4.2 How To Set Up a Simple Unity Project
It's easy to set up a basic Vuforia project in Unity. Follow these steps to import the Vuforia Unity extension and to add and configure the prefabs and assets used to develop Vuforia apps in Unity.
4.2.1 Create a project
i. Create a new project in Unity.
ii. Download the Vuforia Unity Extension.
iii. Browse to the vuforia-unity-xx-yy-zz.unitypackage you just downloaded and double-click the file.
iv. Accept the import of the package into your Unity project.
4.2.2 Obtain a License Key and Create Targets
You will need to create a license key for your app in the License Manager and add this key to your project. Please see: How To Create a License Key and How To Add a License Key to Your Vuforia App.
You will use the online Target Manager to create and manage the target images that your app can recognize and track. The Vuforia Target Manager Guide provides an introduction to working with the Target Manager to create targets and target databases. [8]
Fig. 4.2 Platform Anatomy, Ref. Vuforia Developer portal
4.2.3 Adding Targets

Next, you need to add a Device Database to your project. You can do this in two ways: create a target on the Target Manager, or use existing targets from other projects.
1. To use the Target Manager method, see the Vuforia Target Manager guide to create and download a package.
2. Double-click the downloaded package, or right-click in the Unity Project panel and choose Import Package > Custom Package.
3. Select the downloaded package.
4. Click Import to import the target Device Database. [10]

If you are copying the Device Database files from another project, be sure to copy any files located in the Editor/QCAR/ImageTargetTextures folder. These will be used to texture the target plane in the Unity editor.

4.2.4 You should now see the following folder structure inside Unity:
i. Project Folders
Fig. 4.3 Unity3D Project section, Ref. Vuforia Developer Portal
Editor - Contains the scripts required to interact dynamically with target data in the Unity editor.
Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application.
Vuforia - Contains the prefabs and scripts required to bring augmented reality to your Unity application.
Streaming Assets/QCAR - Contains the Device Database configuration XML and DAT files downloaded from the online Target Manager. [8]
ii. Add AR assets and prefabs to the scene
1. Now that you have imported the Vuforia AR Extension for Unity, you can easily adapt your project to use augmented reality.
2. Open the /Vuforia/Prefabs folder.
3. Delete the Main Camera in your current scene hierarchy, and drag an instance of the AR Camera prefab into your scene. The AR Camera is responsible for rendering the camera image in the background and manipulating scene objects to react to tracking data.
4. With the AR Camera in place and the target assets available in the Streaming Assets/QCAR folder, run the application on a supported device and see the live video in the background.
5. Drag an instance of the Image Target prefab into your scene. This prefab represents a single instance of an Image Target object.
6. Select the Image Target object in your scene, and look at the Inspector. There should be an Image Target Behaviour attached, with a property named Data Set. This property contains a drop-down list of all available Data Sets for this project. When a Data Set is selected, the Image Target property drop-down is filled with a list of the targets available in that Data Set.
7. Select the Data Set and Image Target from your Streaming Assets/QCAR project. In this example, we choose "StonesAndChips". (It is automatically populated from the Device Database XML file that is downloaded from the online Target Manager.) The Unity sample apps come with several Image Targets. To use them, copy them from the Image Targets sample, or create your own at the Target Manager section of the site.

Fig. 4.4 Unity3D Inspector section, Ref. Vuforia developer portal

NOTE: When you added the Image Target object to your scene, a gray plane object appeared. This object is a placeholder for actual Image Targets. In the Inspector view of the Image Target there is a pop-up list called Image Target. From this list, you can choose any Image Target that has been defined in one of the Streaming Assets/QCAR datasets, so that the Image Target object in your scene adopts the size and shape of the Image Target it represents. The object is also textured with the same image from which the Image Target was created. [8]

iii. Add 3D objects to the scene and attach to trackables
Now you can bind 3D content to your Image Target.
1. As a test, create a simple Cube object (GameObject > 3D Object > Cube).
2. Add the cube as a child of the Image Target object by selecting it in the Hierarchy list and dragging it onto the Image Target item.
3. Move the cube in the scene until it is centered on the Image Target. You can also add a Directional Light to the scene (GameObject > Light > Directional Light).

iv. TrackableEventHandler
The default trackable event handler (DefaultTrackableEventHandler) is a script component of the Image Target that causes the cube you just created to appear or disappear as an automatic reaction to the appearance of the target in the video. You can override this default behavior by revising the DefaultTrackableEventHandler script, or by writing your own handler that implements the ITrackableEventHandler interface.

v. Adding dataset load to camera
The Vuforia SDK can use multiple active Device Databases simultaneously. To demonstrate this capability, you can borrow the StonesAndChips and Tarmac Device Databases from the Image Targets sample and configure both to load and activate in the Datasets field of the AR Camera's VuforiaConfiguration asset, which is accessible from the AR Camera's Inspector panel via the Open Vuforia Configuration button. You can also search for VuforiaConfiguration in the Project panel. This allows you to use targets from both Device Databases at the same time in your Unity scene.
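Returning to the custom trackable event handler mentioned under iv., a minimal implementation might look as follows. This is a sketch based on the Vuforia 6.x Unity API (TrackableBehaviour and ITrackableEventHandler as shipped with the extension); the behavior of toggling all child renderers is illustrative, not taken from this project's code.

```csharp
using UnityEngine;
using Vuforia;

// Illustrative replacement for DefaultTrackableEventHandler.
// Attach it to the Image Target object instead of the default handler.
public class MyTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the tracking state of the target changes.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        // Show or hide the attached geometry (e.g. the cube) with the target.
        foreach (var r in GetComponentsInChildren<Renderer>(true))
            r.enabled = found;
    }
}
```

The same pattern is what the KUKA robot model in section 5 relies on: the robot's meshes are children of the Image Target, so they appear only while the target is tracked.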
Fig. 4.5 Vuforia configuration, Ref. Vuforia developer portal
4.3 Deploy the application

The next step is to deploy your application to a supported device.
4.3.1 Android deployment process

Unity provides a number of settings when building for Android devices. Select File > Build Settings > Player Settings from the menu to see the current settings, and choose your platform (Android or iOS) now.
1. Click Resolution and Presentation to select the required Default Orientation.
2. Click Icon to set your application icon.
3. Click Other Settings. Set the Minimum API Level to Android 4.1 'Jelly Bean' (API level 16) or higher. Set Bundle Identifier to a valid name (e.g., com.mycompany.firstARapp).
4. Save your scene (File > Save Scene).
5. Open the build menu (File > Build Settings). Make sure that your scene is part of Scenes in Build. If not, do one of the following:
a. Use Add Current to add the currently active scene, or
b. drag and drop your saved AR scene from the Project view into the window.
You can now build the application. Attach your Android device and then click Build And Run to initialize the deployment process.
4.3.2 iOS deployment process

Unity provides a number of settings when building for iOS devices (File > Build Settings > Platform > iOS icon).
1. Before building, select the required Default Orientation. Note: the Vuforia AR Extension now supports Auto Rotation.
2. Make sure that Target Platform is not set to armv6 (OpenGL ES 1.1). As of Vuforia version 6.2, the Unity extension supports OpenGL ES 2 and 3.
3. Make sure that Bundle Identifier is set to the correct value for your iOS developer profile.
4. Save your scene (File > Save Scene).
5. Open the build menu (File > Build Settings).
6. Make sure that your scene is part of Scenes in Build. If this is not the case:
a. Use Add Current to add the currently active scene, or
b. drag and drop your saved AR scene from the Project view into the window.
7. Press Build And Run to initialize the deployment process. [3] [8]
Fig. 4.6 Output of simple AR project, Ref. Vuforia Developer Portal
5. Main Project: Augmented KUKA Robot

5.1 Stage 1: Procedure for Vuforia and Unity3D

1. Sign up at the Vuforia developer portal and log in.
2. Go to Develop > License Manager > Add License Key. Choose Development as the type, enter the app name, and confirm. Copy the license key listed under the mentioned app name. [10]

Fig. 5.1 Create License Key, Ref. Vuforia License manager

3. Go to Develop > Target Manager > Add Database, give the database a name, and click Create. Click the created database, choose Add Target, browse to the target picture (here: the HS Weingarten logo), mention its width, and click Add.
4. Download the database: click Download Database (ALL), select Unity Editor, and click Download.
5. Download the Vuforia Unity package from the Downloads section.
6. Open Unity3D: File > New, enter a project name, then open File > Build Settings, select the Android platform, and click Switch Platform.
7. Drag the Vuforia Unity package, the image target database, and the .fbx file that we have already created (procedure under section 5.2) into the Assets section, then File > Save Scene As > Save.
8. Go to Assets > Vuforia > Prefabs. Drag the AR Camera into the hierarchy and delete the Main Camera; drag the Image Target prefab into the hierarchy as well.
9. Select the AR Camera. In the Inspector, click Open Vuforia Configuration, paste the generated license key, and under Datasets load the HS-Weingarten Logo database and activate it.
10. Select the Image Target, select the database, and scale it.
11. Drag the .fbx file onto the Image Target in the hierarchy.
12. Click Play and place the logo in front of the camera. It will show the augmented model of the robot.
Fig.5.2 Augmented Visualization of KUKA Robot, Ref. Screen shot from mobile App
5.2 Stage 2: Procedure for VRML model in Blender and creating .fbx file
Import the VRML KUKA robot model into Blender with the following procedure: open Blender, then File > Import > X3D Extensible 3D (.x3d/.wrl).
Fig. 5.3 VRML KUKA Robot model in Blender, Ref. Screen shot of model in Unity3D
Rigging: After importing the VRML model into Blender, follow this procedure for rigging. [1] Snap the cursor to the center (Shift+S), then go to Add > Armature > Single Bone.
Fig. 5.4 Rigging process of KUKA robot, Ref. Screen shot of model in Unity3D
Place the bone at the point about which the body is supposed to rotate. You can rotate the bone by selecting it and pressing the R key; scaling is done with the S key.
Extrude the bone by pressing the E key to the next axis position where the rotation starts. Follow the same process until all the axes are rigged. [10]
Limiting the rotation: To limit the rotation of a particular bone, select the bone, click on 'Bone Constraints', and add the bone constraint 'Limit Rotation'. Depending on the rotation of the axis, limit the X, Y, and Z rotations by giving minimum and maximum rotation values. [1]
Fig. 5.5 Limiting rotations of axes of robot, Ref. Screen shot of model in unity3D
Once all of this is done, parent the object to the bones: select the part of the object, shift-select the respective bone, press Ctrl+P, and choose Bone.
Fig. 5.6 Final rigging of KUKA robot and parenting object to the Bone, Ref. Screen shot of model in Unity3D
Once the rigging procedure is done, the axes of the robot are ready to rotate in the desired manner.
Fig. 5.7 Final model of KUKA robot after rigging, Ref. Screen shot of model in Unity3D
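The rotation limits configured on the bones above are enforced again at runtime by the Unity control script (section 5.3). The underlying idea is simply clamping each joint angle to its allowed range; as a minimal standalone sketch (the class name and example values here are illustrative, not from the project code):

```csharp
using System;

static class JointLimits
{
    // Clamp a joint angle (in degrees) to the range allowed for that axis.
    public static float Clamp(float angle, float min, float max)
    {
        return Math.Max(min, Math.Min(max, angle));
    }
}

// Example: JointLimits.Clamp(95f, -80f, 30f) yields 30f,
// keeping the joint inside its working range.
```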
Converting to .fbx: Unity3D accepts only .fbx files, so the Blender file must be converted to .fbx. Select all objects with the 'A' key, [10] [1] then go to File > Export > FBX (.fbx), check 'Selected Objects', and click Export FBX.
Fig. 5.8 Visualization of KUKA robot without controls, Ref. Mobile screen shot
5.3 Stage 3: Program for the augmented KUKA robot

Unity3D scripting is done in C#, UnityScript (a JavaScript dialect), or Boo, with C++ supported through native plugins. Boo is a CLI language with a syntax very similar to Python; it is, however, statically typed and has a few other differences: it is not Python, it just looks similar. The version of JavaScript used by Unity is also a CLI language and is compiled. Unity takes your C#/JS/Boo code and compiles it to run on iOS, Android, PC, Mac, Xbox, PS3, Wii, or as a web plugin. [2] [5]
Program for the KUKA robot (the comparison operators and a few statements inside the joint-limit checks were incomplete in the listing; the reconstructed parts are marked with comments):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.SceneManagement;

public class ARMManaging : MonoBehaviour {
    public float SpeedRot, SpeedRot2;
    public bool right1, right2, right3, right4;
    public bool left1, left2, left3, left4;
    public float angle;
    float moveHorizontal;
    float moveHorizontal1;
    float rot;
    float rotS;
    public Transform MainBody, Joint1, Joint2, mainRotator, smallJoint, Rotator;

    // Use this for initialization [2]
    void Start () {
        moveHorizontal = 0;
    }

    // Update is called once per frame
    void Update () {
        // Axis 1: rotate the main body left/right while the buttons are held.
        if (left1) {
            MainBody.Rotate (Vector3.up * SpeedRot * Time.deltaTime);
        }
        if (right1) {
            MainBody.Rotate (-Vector3.up * SpeedRot * Time.deltaTime);
        }

        // Axis 2: rotate Joint1 while keeping it inside its working range.
        // The rotate calls inside these bounds checks were incomplete in the
        // original listing; they are reconstructed here to be consistent with
        // the limit values that survived (0, 70 and 90 degrees).
        if (left2) {
            //print (Joint1.localEulerAngles.z + "left2");
            if (Joint1.localEulerAngles.z >= 1 && Joint1.localEulerAngles.z < 90) {
                Joint1.Rotate (-Vector3.forward * SpeedRot2 * Time.deltaTime);
            }
        }
        if (right2) {
            if (Joint1.localEulerAngles.z >= 0 && Joint1.localEulerAngles.z < 70) {
                Joint1.Rotate (Vector3.forward * SpeedRot2 * Time.deltaTime);
            }
            // Safety clamp: snap back just below the 70-degree limit.
            if (Joint1.localEulerAngles.z > 70 && Joint1.localEulerAngles.z < 90) {
                Joint1.localEulerAngles = new Vector3 (Joint1.localEulerAngles.x, Joint1.localEulerAngles.y, 69);
            }
        }

        // Axes 3 and 4: limited rotation driven by the button state.
        rotManager (-80f, 30f, Joint2.transform, 50, moveHorizontal);
        rotManager (-70f, 70f, smallJoint.transform, 50, moveHorizontal1);

        // Axis 5: wrist joint (mainRotator).
        if (rot > 0) {
            mainRotator.Rotate (-Vector3.right * 150 * Time.deltaTime);
        }
        if (rot < 0) {
            mainRotator.Rotate (Vector3.right * 150 * Time.deltaTime);
        }
        // Axis 6: end rotator. The original listing used the same direction
        // for both branches; one sign is flipped here so the two buttons
        // rotate in opposite directions.
        if (rotS < 0) {
            Rotator.Rotate (-Vector3.right * 150 * Time.deltaTime);
        }
        if (rotS > 0) {
            Rotator.Rotate (Vector3.right * 150 * Time.deltaTime);
        }
    }

    public void rotManager (float min, float max, Transform rotationtomove, float Speed, float movHorizontal) {
        Quaternion rotationMin = Quaternion.Euler (new Vector3 (0f, 0f, min));
        Quaternion rotationMax = Quaternion.Euler (new Vector3 (0f, 0f, max));
        Quaternion rotations = rotationtomove.localRotation;
        // The increment branch was missing from the original listing; it is
        // reconstructed symmetrically to the surviving decrement branch.
        if (movHorizontal > 0 && rotations.z <= rotationMax.z) {
            rotations.z += Quaternion.Euler (new Vector3 (0f, 0f, Speed * Time.deltaTime)).z;
        }
        if (movHorizontal < 0 && rotations.z >= rotationMin.z) {
            rotations.z -= Quaternion.Euler (new Vector3 (0f, 0f, Speed * Time.deltaTime)).z;
            print ("working2");
        }
        rotationtomove.localRotation = rotations;
    }

    // Button callbacks for axis 1
    public void Left1Up () { left1 = false; }
    public void Left1Down () { left1 = true; }
    public void Right1Up () { right1 = false; }
    public void Right1Down () { right1 = true; }

    // Button callbacks for axis 2
    public void left2Up () { left2 = false; }
    public void left2Down () { left2 = true; }
    public void Right2UP () { right2 = false; }
    public void Right2Down () { right2 = true; }

    // Button callbacks for axis 3
    public void left3UP () { moveHorizontal = 0; }
    public void left3Down () { moveHorizontal = -1; }
    public void Right3UP () { moveHorizontal = 0; }
    public void Right3Down () { moveHorizontal = 1; }

    // Button callbacks for axis 4
    public void Left4Up () { moveHorizontal1 = 0; }
    public void Left4Down () { moveHorizontal1 = -1; }
    public void Right4Up () { moveHorizontal1 = 0; }
    public void Right4down () { moveHorizontal1 = 1; }

    // Return all joints to the home pose.
    public void reset () {
        MainBody.transform.localEulerAngles = new Vector3 (0, 0, 0);
        Joint1.transform.localEulerAngles = new Vector3 (0, 0, 2);
        Joint2.transform.localEulerAngles = new Vector3 (0, 0, 0);
        smallJoint.transform.localEulerAngles = new Vector3 (0, 0, 0);
        mainRotator.localEulerAngles = Vector3.zero;
    }

    // Button callbacks for the mainRotator (axis 5)
    public void RrotDown () { rot = 1; }
    public void RrotUp () { rot = 0; }
    public void LrotDown () { rot = -1; }
    public void LrotUP () { rot = 0; }

    // Button callbacks for the Rotator (axis 6)
    public void srotU () { rotS = 0; }
    public void srotD () { rotS = 1; }
    public void srotDL () { rotS = -1; }
    public void srotUL () { rotS = 0; }

    // Return to the menu scene.
    public void home () {
        SceneManager.LoadScene ("MENU");
    }
}
5.4 Stage 4: Creating Controller (Buttons) for augmented KUKA robot
Download the button images you require into a folder, and drag this folder into Sprites under the Assets section. Then create a button: Game Object > UI > Button, and under Source Image select the button image and click Set Native Size. The button is created.

Fig. 5.9 Design of Buttons for controller, Ref. Screen shot in Unity3D

To add the C# script: click on the Image Target. In the Inspector section, choose Add Component > New Script, give it the name 'ARMManaging', click Create and Add, and then click on the script.
The script opens in MonoDevelop, where the code for the joystick controls (section 5.3) has to be written.
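A standard UI Button's OnClick event fires only on release; for hold-to-move controls like these, the Down/Up handler pairs of the ARMManaging script are typically wired through pointer events. As a sketch of doing this from code (the field name 'arm' and the choice of axis-1 handlers are illustrative; in the project the same wiring can be done in the Inspector via an Event Trigger component):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to a UI Button. The robot rotates while the button is held down.
public class HoldButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    // Illustrative reference to the ARMManaging script of section 5.3.
    public ARMManaging arm;

    public void OnPointerDown(PointerEventData eventData)
    {
        arm.Left1Down();   // start rotating axis 1 while pressed
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        arm.Left1Up();     // stop when released
    }
}
```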
Fig. 5.10 Final design of controller for augmented KUKA robot, Ref. Screen shot in Unity3D
5.5 Stage 5: Android App development

A. Setting up the Android SDK Tools

The first thing we need to install is the Java Development Kit (JDK). Go to the Java site and download the most recent JDK; it is labelled as "Java Platform (JDK)". Choose the one with the highest version number, run the installer, and follow the instructions in the wizard.

Next, we need to install the Android SDK Tools. Go to the Android Developer site and download the Android SDK Tools (also referred to on the site as "the command line tools") rather than the full download of Android Studio. Unzip the downloaded file; the resulting directory contains the Android SDK Tools. Put the directory in a memorable, accessible location - you will need to tell Unity where this directory is later.

Open the directory that contains the Android SDK Tools and navigate to 'tools'. Double-click the file called 'android' to run it. A popup will appear, showing a list of packages that can be installed. By default, the core packages for building and the package for the latest version of the Android OS are selected.
Fig. 5.11 Installation of Android SDK tool, Ref. Unity3D tutorial

If the device you are building to runs the latest version of Android, you do not need to select anything else. If it runs an earlier version of Android, scroll down the list of packages and select the version you need. Click Install [x] packages to start the installation process. You will be prompted to accept the licenses for these packages: for each package that you wish to install, select the license on the left, select Accept License, and then click Install to proceed.
B. Preparing the Unity project for building to Android

We now need to return to Unity and switch platforms so that we can build our app for Android. Within Unity, open the Build Settings from the top menu (File > Build Settings). Highlight Android in the list of platforms on the left and choose Switch Platform at the bottom of the window.
Fig. 5.12 Build setting window, Ref. Unity3D tutorial
Open the Player Settings in the Inspector panel (Edit > Project Settings > Player). Expand the section at the bottom called Other Settings, and enter your chosen bundle identifier where it says Bundle Identifier.
Fig. 5.13 Identification Bundle Identifier, Ref. Unity 3D tutorial

Finally, we need to tell Unity where we installed the Android SDK Tools. Using the top menu, navigate to Unity > Preferences (on OS X) or Edit > Preferences (on Windows). When the Preferences window opens, navigate to 'External Tools'. Where it says 'Android SDK Location', click 'Browse', navigate to the directory containing the Android SDK Tools, and click 'Choose'.
Fig. 5.14 External Tools, Ref. Unity3D Tutorial

C. Building an Android project using Unity

Connect your Android device to your computer using a USB cable. You may see a prompt asking you to confirm that you wish to enable USB debugging on the device; if so, click 'OK'. In Unity, open the Build Settings from the top menu (File > Build Settings). Click 'Add Open Scenes' to add the main scene to the list of scenes that will be built, then click 'Build And Run'.

Fig. 5.15 Build setting window, Ref. Unity3D tutorial
6. Applications of AR in industrial robotics

There are a number of applications of augmented reality in industrial robotics. Some of them are as follows:

1. Human-robot interactive cooperation (visual alerts): This functionality, aimed at enhancing the safety and awareness of the operator, involves automated alerts that originate from the collaboration scheme between human and robot. The difference from the previous functionalities is that these alerts are automatically triggered whenever an event with safety implications occurs (e.g. when the robot starts to move in AUTO mode). [9]

Fig. 6.1 AR in manufacturing, Ref. 48th CIRP Conference on MANUFACTURING SYSTEMS

2. ABB's holographic computer for industrial robots: ABB's holographic computer projects an augmented model of the robot into the user's view, so that one can experience it as part of the real world. This virtual experience allows the user to modify work stations in the augmented model by pinch gestures and to see what needs to be changed in the real operation. [9]
Fig. 6.2 ABB’s holographic industrial robot, Ref. Blog on LinkedIn written by Anna Vondrackova
7. Mapping virtual robot with real industrial robot There is an App designed by a student from NYU that utilizes augmented reality (AR) to allow users to tell robots where to go and what to do. The app uses the camera on smart devices to “capture” a scene. It then overlays markers on designated “virtual objects” that can be then spatially manipulated within the app, using gestures. These taps, swipes and finger-drawn lines on the smart device‟s screen translates into corresponding movements or actions from the robots in the real world.
Fig. 7.1 Positioning fiducial markers, Ref. IEEE Spectrum

It was built with the software development platform Xcode to create a virtual grid with a coordinate system. The virtual objects defined by the user are placed within these virtual coordinates, and visual tags called fiducial markers are placed on whatever the user wants to control within this virtual space, whether that is a robot or another item that needs to be moved. The smart device's built-in sensors, such as its accelerometers, gyroscopes, and magnetometers, are also brought into play when establishing this virtual scene.

Fig. 7.2 Virtual view of model, Ref. IEEE Spectrum
Fig. 7.3 Pick and place augmented operation, Ref. IEEE Spectrum

With the virtual stage set up in this way, the scene is captured using the device's camera. Once that is done, the user can give commands via the smart device by manipulating the scene's virtual objects. These instructions are relayed over WiFi to the robots, which are equipped with Raspberry Pi boards as the primary controllers that process the commands.

Fig. 7.4 Pick and place industrial robot operation, Ref. IEEE Spectrum
The main advantage of this system is that it does not need special equipment: "Unlike the methods that are conventionally used to interact with sophisticated teams of robots, our approach does not require purchasing or installing any additional hardware or software and does not require the interaction to be in a traditional laboratory environment."

Fig. 7.5 Mapping of augmented and real industrial robot, Ref. IEEE Spectrum

One could potentially just take out a smart device with the app installed, snap a scene, and conveniently begin to control a group of robots that are connected to the system. A tool like this would be relatively more mobile and could have huge implications for better integrating robots into everyday life and into many industries.
8. Final Augmented Model of KUKA Robot
Fig.8.1 Augmented visualization and control of KUKA robot in App, Ref. Mobile screen shots
References

[1] Robot Arm Animation Course in Blender: http://blenderbuzzcourses.teachable.com/p/robot-arm-course-in-blender
[2] Jens Grubert and Raphael Grasset, 'Augmented Reality for Android Application Development', Packt Publishing
[3] Dieter Schmalstieg and Daniel Wagner, 'Mobile Phones as a Platform for Augmented Reality', Graz University of Technology
[4] 'Types of Augmented Reality', article published by Reality Technologies: http://www.realitytechnologies.com/augmented-reality
[5] 'How to program in C# - BASICS - Beginner Tutorial', published by Brackeys: https://www.youtube.com/watch?v=pSiIHe2uZ2w&list=PLPV2KyIb3jR6ZkG8gZwJYSjnXxmfPAl51
[6] Wikipedia: https://en.wikipedia.org/wiki/List_of_augmented_reality_software
[7] Wendy E. Mackay, 'Augmented Reality: Linking Real and Virtual Worlds - A New Paradigm for Interacting with Computers', Department of Computer Science, Université de Paris-Sud, Orsay Cedex, France
[8] Vuforia Developer Portal: https://developer.vuforia.com/
[9] George Michalos, Panagiotis Karagiannis, Sotiris Makris, Önder Tokçalar, George Chryssolouris, 'Augmented reality (AR) applications for supporting human-robot interactive cooperation', 48th CIRP Conference on Manufacturing Systems (CIRP CMS 2015)
[10] Unity3D Learn Module: https://unity3d.com/learn