X3D Extension for AR Contents Standards
Jounghyun Kim, Korea University
Approach
- Extensibility to existing frameworks: X3D (scene graph), KML, MAF, …
- Generality/flexibility to accommodate:
  - Different AR platforms: mobile, desktop, HMD, …
  - Sensors and devices: vision based, marker based, location based, …
- Focus on the file format / contents representation rather than authoring applications
Related work
- Vision-based AR authoring tools
  - BuildAR (HITLabNZ): marker-to-virtual-object association only
  - Naver (KIST) + NF: virtual object association
  - Instant Reality (Fraunhofer): X3D extension (marker as an IO sensor)
- Location/GPS-based AR
  - ARML / KML: GPS coordinate to virtual object/behavior association
- Behavior
  - TARML (G. Lee / ETRI): behavioral abstraction
Major proposals
- Extend the "View" node: resolution between the "live" camera and the virtual camera
- Define a "LiveCamera" node (G. Lee / ETRI)
  - Not necessarily only for AR contents (e.g. video textures)
  - Parameters set by the user
- More detailed parameter specification for the "View" node
  - Set by the user
  - Routed from the LiveCamera node, with the possibility of behavioral manipulation
  - Routed from a sensor: the camera could be tracked separately
  - Default: same as the world (note that the view can be relative to anything)
Major proposals
- Extend the MovieTexture / Background nodes for the AR background
  - Also proposed by G. Lee / Instant Reality
- Introduce a "Physical Sensor" node: an extension of the X3D sensors
- (Physical) sensor-based X3D, not just for AR
- Issue: classification by sensor functionality vs. sensor type
  - Functionality: proximity, collision, location, image patch, …
  - Type: distance, contact, GPS, camera, …
[Diagram: the X3D (virtual) world, containing the View (virtual camera), MovieTexture*, and other X3D nodes, connected via ROUTEs* to the real/physical side of AR contents: the Live Camera and the virtual and physical sensors.]
ROUTE
- For event processing and simulation
- Propagates events (e.g. from sensors) and processes them in one simulation tick
- Can be used for various physical-to-virtual mappings, but …
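As a sketch of this event flow (the MarkerSensor node is hypothetical, standing in for the proposed physical sensor):

```vrml
DEF marker1 MarkerSensor { }            # hypothetical physical sensor node
DEF model   Transform {
  children Shape { geometry Box { } }
}

# Within one simulation tick, the sensed pose drives the virtual object.
ROUTE marker1.position    TO model.set_translation
ROUTE marker1.orientation TO model.set_rotation
```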
Live video support: the LiveCamera node
- Placed inside the Scene node; represents a physical camera device
- The camera device ID is set in the source field
- Live video is provided through the image field
- The physical camera's internal parameters are provided through the projmat field
- The image and projmat are provided with lens distortion already corrected
- If tracking is supported, pose information is provided (?)

LiveCamera {
  SFString   [in,out] source      "default"
  SFImage    [out]    image
  SFMatrix4f [out]    projmat     "1 0 0 …"
  SFBool     [out]    on          FALSE
  SFBool     [out]    tracking    FALSE
  SFVec3f    [out]    position
  SFRotation [out]    orientation
}
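A minimal usage sketch of the proposed node, assuming the field layout above (the device ID string is illustrative):

```vrml
# Hypothetical instantiation of the proposed LiveCamera node.
# "camera#0" is an illustrative device ID; actual IDs are platform-dependent.
DEF liveCam LiveCamera {
  source "camera#0"
}
```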
Limitations of the MovieTexture node for mixed reality applications
- The video source is expressed as a URI
- A streaming server is required to support live video
- No camera parameter information
- Limited in its role as an AR background

MovieTexture node

Live video support: using a Texture
Live video support: using a Background
Routing from the LiveCamera node
- From the LiveCamera node's image field
- To the Background node (liveURL field) or a Shape (MovieTexture field)
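The routing above could be sketched as follows; the set_image input fields on Background and MovieTexture are assumptions of the proposed extension, not standard X3D:

```vrml
DEF liveCam LiveCamera   { source "camera#0" }  # device ID illustrative
DEF bg      Background   { }                    # assumed live-image input extension
DEF tex     MovieTexture { }                    # assumed live-image input extension

# Feed the live video into the scene background, or onto geometry.
ROUTE liveCam.image TO bg.set_image
ROUTE liveCam.image TO tex.set_image
```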
Camera image registration
- Match the virtual camera's parameters to the physical camera
  - Internal parameters = projection matrix
  - External parameters = camera pose
- Manual: direct specification
- Routing: from the LiveCamera node or from a sensor
Camera image registration: the Viewpoint node

Viewpoint : X3DViewpointNode {
  SFMatrix4f [in]     projmat
  SFVec3f    [in,out] position
  SFRotation [in,out] orientation
  SFNode     [in,out] liveCamera
  # Add distortion parameters here
}
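Registration via routing might then look like this sketch, assuming the extended Viewpoint fields above (node names are illustrative):

```vrml
DEF liveCam LiveCamera { source "camera#0"  tracking TRUE }
DEF arView  Viewpoint  { }

# Intrinsics: copy the physical camera's projection matrix.
ROUTE liveCam.projmat     TO arView.set_projmat
# Extrinsics: follow the tracked physical camera pose.
ROUTE liveCam.position    TO arView.set_position
ROUTE liveCam.orientation TO arView.set_orientation
```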
Physical sensor parameters

Field     Format  Default value  Value range
name      string  Sensor#x
type      char    S              S/M (Single/Multi)
tic_time  time
…         …
Physical Sensor
- Exports position and orientation (plus other values, such as detection events) of the sensed object with respect to its parent
- Routed to various X3D nodes: 3D geometry, behaviors, 2D objects
- Type
  - Value it returns
  - Parent (Earth, virtual world, other node, sensor, etc.)
  - Determines the specific attributes needed (e.g. a marker will need the marker file info and its dimensions)
Physical sensor: camera/marker

Field        Format
pos          Vec3f
orientation  SFRotation
parent       Node
type         …
filename     File path
width        float
center       Vec3f
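An instance sketch using the fields tabulated above (node name, field syntax, and values are illustrative):

```vrml
DEF marker1 PhysicalSensor {            # hypothetical node/field syntax
  type     "marker"
  filename "markers/pattern1.patt"      # marker pattern file (illustrative path)
  width    0.08                         # physical marker width (illustrative, meters)
  center   0 0 0
}
```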
Sensor on a tracked object
- Located at the same position as the live camera / tracked object
- Exports position and orientation of itself (the camera) or of something else, with respect to the world or something else
Field        Format      Default value
parent       Node        this
pos          Vec3f       0 0 0
orientation  SFRotation
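A sketch of such a sensor co-located with the live camera, exporting the camera's own pose to drive the virtual viewpoint (node names hypothetical):

```vrml
DEF liveCam   LiveCamera     { source "camera#0" }
DEF arView    Viewpoint      { }
DEF camSensor PhysicalSensor {          # hypothetical node
  parent USE liveCam                    # sensor rides on the live camera (default: this)
}

# Drive the virtual viewpoint from the tracked camera pose.
ROUTE camSensor.position    TO arView.set_position
ROUTE camSensor.orientation TO arView.set_orientation
```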
Camera/sensor-relative landmark
- Static landmarks specified with respect to the camera, world, earth, anything
- Returns "is_detected"
- Type: position (/orientation), edge/face
Field        Format      Default value
position     Vec3f
orientation  SFRotation
parent       Node        3D model
type         Property
…
Multiple sensors (e.g. GPS + compass)
- E.g. static landmarks specified with respect to the earth
- Their recognition instigates contents behavior
- Type: position + orientation
Field          Format       Default value  Value range
type           char         S              S/M (Single/Multi)
device         description
value          Float[]
tic_condition  g/l value    0 0 0

Example:
< thermometer type="S" device="initial_temperature_sensor#1" tic_condition="g 20.000"

(g: greater than, l: less than)
Example (acceleration)
Script node

Script {
  exposedField MFString url          []
  field        SFBool   directOutput FALSE
  field        SFBool   mustEvaluate FALSE
  # and any number of:
  eventIn  eventType eventName
  field    fieldType fieldName initialValue
  eventOut eventType eventName
}
(VRML 2.0 default Script template)
Example (GPS + Script)

DEF SC_TG_U Script {
  mustEvaluate TRUE
  directOutput TRUE
  eventIn SFTime  timeClick
  field   SFBool  onLight FALSE
  field   SFNode  color USE color
  field   SFColor to1 1 0.5 0
  field   SFColor to2 0.5 0.5 0.5
  url ["javascript:
    function timeClick(time) {
      if (onLight == TRUE) {
        color.diffuseColor = to2;
      } else {
        color.diffuseColor = to1;
      }
      onLight = !onLight;
    }"]
}
Other issues
- Built-in behaviors: use the built-in X3D behaviors (interpolators, etc.)
- Occlusion and depth map
  - Depth estimation using ghost objects
  - Applying real-time depth images (depth maps): stereo vision
[Figures: ghost object; depth map]
Summary
- Physical sensors as the bridge between the virtual and the real
- Extensions of the camera, Background/MovieTexture, and routes
- Generality: applicable to other types of interaction
Future work
- More examples
- Build international consensus
- Implementation
  - Naver/OSG implementation
  - Authoring tool export