Initial comparison of two important gesture input systems: the Leap Motion Controller and the Microsoft Kinect.

Leap Motion Controller:

Tracking Capability:

Objects:
The Leap Motion software uses an internal model of a human hand to provide predictive tracking even when parts of a hand are not visible. It can track three kinds of objects:
• Hands: Provides information about the identity, position, and other characteristics of a detected hand.
• Fingers: Provides information about each finger on a hand.
• Finger-like tools: Only thin, cylindrical objects are tracked as tools.

Motions:
Motions are estimates of the basic types of movement inherent in the change of a user's hands over a period of time. These include:
• Translate: Average change of position of an object or objects in the scene.
• Scale: Motion of objects towards or away from each other.
• Rotate: Differential movement of objects within the scene.

Gestures:
The Leap Motion software recognizes certain movement patterns as gestures, which can indicate a user intent or command. It currently supports three types of gestures:
• Circle: A finger tracing a circle.
• Swipe: A long, linear movement of a hand and its fingers.
• Tap: A tapping movement by a finger.

Distance Range:
The effective range of the device is 25 to 600 millimeters above the device (1 inch to 2 feet), which makes it well suited to close-range PC interaction [3].

Hardware:
The controller uses the stereo-vision principle to identify fingers, hands, and gestures using optical trackers. It consists of:
• Three separate IR LED emitters
• Two IR cameras
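The translate and scale estimates described above can be illustrated with a small sketch over two frames of tracked positions. The frame data below is made up for illustration; this is not the Leap API's own computation, only the underlying idea.

```javascript
// Sketch: estimating "translate" and "scale" motions from two frames of
// tracked 2D positions, in the spirit of the Leap Motion motion estimates.

function centroid(points) {
  const sum = points.reduce((a, p) => [a[0] + p[0], a[1] + p[1]], [0, 0]);
  return [sum[0] / points.length, sum[1] / points.length];
}

function meanSpread(points) {
  // Average distance of the points from their centroid.
  const c = centroid(points);
  return (
    points.reduce((a, p) => a + Math.hypot(p[0] - c[0], p[1] - c[1]), 0) /
    points.length
  );
}

// Translate: average change of position between two frames.
function estimateTranslation(prev, curr) {
  const [x0, y0] = centroid(prev);
  const [x1, y1] = centroid(curr);
  return [x1 - x0, y1 - y0];
}

// Scale: ratio of spreads; a value > 1 means the points moved apart.
function estimateScale(prev, curr) {
  return meanSpread(curr) / meanSpread(prev);
}

// Two hypothetical fingertips drifting right and up by one unit each:
console.log(estimateTranslation([[0, 0], [2, 0]], [[1, 1], [3, 1]])); // → [1, 1]
// The same fingertips spreading to twice their separation:
console.log(estimateScale([[0, 0], [2, 0]], [[0, 0], [4, 0]])); // → 2
```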

Accuracy:
• The controller's accuracy drops with distance (radius) and when the tracked objects are to the far left or far right [1].
• There is a significant drop in accuracy for samples taken more than 250 mm above the controller [1].
Nevertheless, the manufacturer claims it is 100x more accurate than anything else on the market, with 1/100th-mm (tip-of-a-pin) accuracy [2].

Web Integration:
Inbuilt JavaScript API with support for most modern browsers, highly optimized for rendering.
• leap.js, available from http://js.leapmotion.com/

Advantages over Microsoft Kinect:
• The strength of the Leap Motion controller is the accurate level of detail provided by the Leap Motion API. The API provides access to detection data through a direct mapping to hands and fingers.


This contrasts with the Microsoft Kinect, where sensory data is returned in a raw format that must then be cleaned up and interpreted. The benefit of strong API preprocessing is that error reduction can be abstracted away from client applications, meaning client applications can be built faster and with consistently accurate data [4].
• More than 30K web-based apps. It has a strong presence on the web, and the reason is simple: SDK support for multiple platforms.
• Small and portable, low Z-height, low CPU and power consumption.
• It is compatible with both USB 2.0 and USB 3.0, whereas the Kinect requires USB 3.0 and does not support most USB-over-Ethernet extenders, which makes deployment more complicated.
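A minimal sketch of how the JavaScript API's frame data might be consumed. The frame object below is a hand-built stand-in for illustration; in a real page the handler would be registered with leap.js (via its frame loop) rather than called directly.

```javascript
// Sketch: a frame handler in the style of the leap.js JavaScript API.
// The mock frame is hypothetical and only mimics the general shape of
// the data (hands with fingers, plus recognized gestures).

function describeFrame(frame) {
  const hands = frame.hands.length;
  const fingers = frame.hands.reduce((n, h) => n + h.fingers.length, 0);
  const gestures = frame.gestures.map((g) => g.type);
  return { hands, fingers, gestures };
}

// Mock frame standing in for what the API would deliver each update:
const mockFrame = {
  hands: [{ fingers: [{}, {}, {}, {}, {}] }], // one hand, five fingers
  gestures: [{ type: "swipe" }],
};

console.log(describeFrame(mockFrame));
// hands: 1, fingers: 5, gestures: ["swipe"]
```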

 

Microsoft Kinect:

Tracking Capabilities:

Kinect is a sensor designed primarily for gaming, originally alongside a gaming console. Microsoft did release a Windows version, but given its capabilities, gaming and fitness are the areas where the Kinect fits best.

Skeleton Tracking:
• Can track up to 6 skeletons with 25 joints each, including joint orientation and finger tracking.
• Applications can also track the positions of a user's joints (head, shoulders, hips, hands, etc.) in space.

Face Tracking:
• Various attributes of the user's face can be tracked, including the relative positions of lips and eyebrows, which can be interpreted as facial expressions.

Multiple Sensors:
• Allows tracking of IR-reflective objects while filtering out IR lights.
• Wider view of the space, to track objects closer to the cameras.

Gestures and Voice Control:
• Can detect open and closed hands as well as the two-finger "lasso" gesture.
• A larger field of view allows the sensor

Distance Range:
Kinect is well suited to mid- or long-range distances.
• 850 mm to 4000 mm range (normal mode)
• 400 mm to 3000 mm range (near mode)

Hardware:
• 3D depth sensors
• RGB camera
• Multi-array microphone
• Motorized tilt

Accuracy:
• The Kinect cannot easily distinguish individual fingers on a hand, which means gestures tend to involve grosser movements than simple pointing.
• Up to six users can be recognized in the field of view of the sensor, but only two users can be tracked in detail.
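The six-recognized / two-tracked limit above can be illustrated with a small sketch. The skeleton records and the nearest-first selection policy here are hypothetical, purely to show the shape of such a constraint; they are not the Kinect SDK's own data model or behavior.

```javascript
// Sketch: up to six users are recognized, but only two can be tracked in
// detail. One plausible policy (hypothetical) is to pick the two users
// closest to the sensor for detailed tracking.

function selectDetailedUsers(skeletons, maxDetailed = 2) {
  return skeletons
    .slice() // avoid mutating the caller's array
    .sort((a, b) => a.distanceMm - b.distanceMm)
    .slice(0, maxDetailed)
    .map((s) => s.id);
}

const recognized = [
  { id: 1, distanceMm: 2500 },
  { id: 2, distanceMm: 1200 },
  { id: 3, distanceMm: 3900 },
  { id: 4, distanceMm: 1800 },
];

console.log(selectDetailedUsers(recognized)); // → [2, 4]
```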

Web Integration:
• The SDK and driver are only available for Windows 8.
• Not all features are available in open-source development toolkits such as the OpenNI framework [6].

Advantages over Leap:
• Audio channel: Can be used as a microphone as well as for speech recognition, e.g. multimodal feedback ("OK", "YES", etc.).
• The Kinect SDK provides access to the raw depth data from the sensor, as well as images from the IR and RGB cameras, and developers can use these as needed.
• Full-body tracking.

* Many new features and APIs are coming with the next version, such as the Visual Gesture Builder, which will use machine learning and body-frame data to define gestures.
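Working with the raw depth data mentioned above means the developer does the cleanup. A minimal sketch of one such cleanup step, masking out readings outside the near-mode range (400–3000 mm); the flat array-of-millimeters frame layout is an assumption for illustration, not the actual SDK format.

```javascript
// Sketch: filtering raw per-pixel depth readings to the near-mode range.
// Returns 1 where the reading is usable in near mode, 0 elsewhere.

function nearModeMask(depthMm, min = 400, max = 3000) {
  return depthMm.map((d) => (d >= min && d <= max ? 1 : 0));
}

// Hypothetical readings in millimeters (0 = no return from the sensor):
const frame = [0, 350, 400, 1500, 3000, 3200];
console.log(nearModeMask(frame)); // → [0, 0, 1, 1, 1, 0]
```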

Final Verdict:

For our use case we want an input device that enables students to express themselves and interact with their hands. Furthermore, we are looking for a device with extensive support on the web. In both respects, the Leap Motion Controller wins.

References:

[1] J. Guna, G. Jakus, M. Pogačnik, S. Tomažič, and J. Sodnik, "An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking," Sensors, 2014.
[2] About the product: https://www.leapmotion.com/product
[3] API documentation reference: https://developer.leapmotion.com
[4] L. Potter, J. Araullo, and L. Carter, "The Leap Motion controller: A view on sign language," ACM, 2013.
[5] J. Webb and J. Ashley, Beginning Kinect Programming with the Microsoft Kinect SDK, 2012.
[6] M. Vinkler, "Integrating Motion Tracking Sensors to Human-Computer Interaction with Respect to Specific User Needs," CESCG, 2014.
