A Large Ultra High Resolution Tiled Display System: Architecture, Technologies, Applications, and Tools

Sachin Deshpande, Chang Yuan, Scott Daly, Ibrahim Sezan
Sharp Laboratories of America, Camas, WA 98607, USA

ABSTRACT
We describe the architecture, new applications, and enabling technologies of SharpWall, a tiled display system with 10Kx4.5K resolution and a 177-inch diagonal, which we have built as a prototype to research future large-size ultra-high-resolution single-panel displays.


1. INTRODUCTION
Displays with larger screen sizes and higher resolutions are expected to become increasingly affordable and ubiquitous. Large displays are often used in niche markets such as public displays and digital signage, including displays at public places such as airports, museums, hotels, stadiums, hospitals, and malls. These displays are often constructed from tiles of individual displays. In academia as well, universities, research institutes, and corporations often build large wall-sized displays from individual display panels. Such large tiled displays are used for scientific and medical visualization applications [1]. Examples of such tiled displays include the LambdaVision display at the University of Illinois at Chicago's Electronic Visualization Laboratory [2], the Stallion tiled display at the Texas Advanced Computing Center [3], and the Stanford School of Medicine tiled display [4]. In this paper, we describe SharpWall, a large ultra-high-resolution tiled display system we have built at Sharp Laboratories of America. We first describe its architecture and middleware. This is followed by a description of its high-resolution 4K uncompressed and compressed video playback capability. We then introduce two technologies that we have developed for large displays: viewer reactive display and spatial audio. After discussing our scalable user interface toolkit, which can be used for application development on large displays, we explain a distributed inter-tile video synchronization algorithm and a synchronization design tool for tiled displays. Our focus has been on interactive displays: developing technologies that enable convenient single-user or multi-user interaction with single or multiple applications, and applications that react to users' positions.

2. ARCHITECTURE AND MIDDLEWARE
The majority of current tiled display systems are driven by a cluster of computers. In a typical tiled display architecture, a set of "display nodes" (computers) drives individual tiles of the display. Often a single computer

node can drive two display tiles from a single graphics card using two DVI connections. Depending upon the type of middleware, the display nodes may show data that is rendered on one or more of the display nodes. Such an architecture, for example, can be enabled by the Chromium [5] middleware. In the architecture we use in SharpWall, a set of "rendering nodes," separate from the "display nodes," performs the job of rendering application data. The rendered application data is then transmitted over a high-speed network, in compressed or uncompressed form, to the display nodes. In the SharpWall design, the data can be transmitted over a 10 Gbps fiber optic network or 1 Gbps Ethernet. Our display nodes and rendering nodes use the Scalable Adaptive Graphics Environment (SAGE) [6] middleware. SAGE is a specialized graphics streaming architecture and middleware that enables data, high-definition video, and extremely high-resolution graphics to be streamed in real time from distributed rendering and storage clusters to scalable display walls. Figure 1 shows the architecture of the SharpWall display system, which we have built from 20 Sharp Aquos® LCD panels tiled together. The SharpWall, shown in Figure 2, measures 177 inches diagonally and has a resolution of 10K x 4.5K pixels.
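The choice between the 1 Gbps and 10 Gbps transmission paths can be sanity-checked with a back-of-the-envelope link budget. The sketch below is a rough estimate, not a description of the actual SharpWall implementation: it assumes 4096x2160 frames at 24 bits per pixel and ignores packet and protocol overhead.

```python
# Back-of-the-envelope link budget for streaming uncompressed video
# to the display nodes. Frame geometry and bit depth are assumptions
# (here "4K" means 4096x2160 at 24 bits per pixel, no chroma subsampling).

def required_gbps(width, height, bits_per_pixel, fps):
    """Raw payload rate in Gbit/s, ignoring packet/protocol overhead."""
    return width * height * bits_per_pixel * fps / 1e9

rate_18fps = required_gbps(4096, 2160, 24, 18)
rate_24fps = required_gbps(4096, 2160, 24, 24)
print(f"4K @ 18 fps: {rate_18fps:.2f} Gbit/s")  # under a 10 Gbps link
print(f"4K @ 24 fps: {rate_24fps:.2f} Gbit/s")
```

At roughly 3.8 Gbit/s for 18 fps, uncompressed 4K fits within a 10 Gbps link but is far beyond 1 Gbps Ethernet, which is why the 1 Gbps path carries compressed streams.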

Figure 1: SharpWall tiled display system architecture
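The partitioning of the global canvas across display nodes can be sketched as below. The 5x4 grid and per-tile resolution are assumptions chosen only so that 20 tiles add up to a 10K x 4.5K canvas; the actual SharpWall layout and SAGE's internal partitioning may differ.

```python
# Sketch of how a global canvas is split into per-tile viewports, in the
# spirit of the rendering-node -> display-node streaming described above.
# Grid shape and tile resolution are illustrative assumptions.

CANVAS_W, CANVAS_H = 10240, 4608   # ~10K x 4.5K global canvas (assumed)
COLS, ROWS = 5, 4                  # 20 tiles (assumed arrangement)
TILE_W, TILE_H = CANVAS_W // COLS, CANVAS_H // ROWS

def tile_viewport(row, col):
    """Rectangle (x, y, w, h) of the global canvas shown by tile (row, col)."""
    return (col * TILE_W, row * TILE_H, TILE_W, TILE_H)

# A rendering node streams to each display node only the pixels that
# fall inside that node's viewport.
coverage = [tile_viewport(r, c) for r in range(ROWS) for c in range(COLS)]

# Sanity check: the 20 viewports exactly tile the canvas area.
assert sum(w * h for _, _, w, h in coverage) == CANVAS_W * CANVAS_H
```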

3. HIGH RESOLUTION 4K VIDEO PLAYBACK
SharpWall supports high-resolution 4K (4096x2160) uncompressed and compressed video streaming and smooth playback over LAN and WAN. 4K and higher resolution video can enable a number of high-quality applications such as life-like remote tele-presence, high-fidelity scientific visualization, digital cinema, etc. We built a streaming media server that provides 4K uncompressed streaming using low-cost commodity hardware components. Our goal is to achieve a real-time frame rate for 4K uncompressed streaming. For this, we have developed the following optimizations and technologies:

• Network transmission packet size, Maximum Transmission Unit (MTU), and TCP parameter optimization
• Multi-rail technologies for increasing network throughput
• Smooth, jitter-free media playback with optimized sender- and receiver-side buffering

Figure 2: SharpWall tiled display system showing 4K streaming video playback (the video shown is copyright of CineGrid)

Our current performance is 4K compressed streaming at more than 24 frames per second on a 1 Gbps network using a commodity desktop PC, and 4K uncompressed streaming at 18 frames per second on a 10 Gbps network with a low-cost commodity server. Figure 2 shows a screenshot of

our SharpWall showing a 4K streaming video.

4. VIEWER REACTIVE DISPLAY
Our specific focus has been on interactive large high-resolution displays, rather than passive displays such as signage. We have therefore developed viewer interaction technologies that allow the display to react to a viewer's presence and actions. Besides the traditional keyboard, mouse, and remote control, viewers can control the display by moving around in front of it. Viewers wear lightweight infra-red reflective markers and are tracked by multiple infra-red cameras. The scene rendered on the display is updated in real time (>= 30 FPS) based on the viewer's tracked 3D position. One application of the viewer reactive display technology is the "virtual see-through window" [7]: as viewers move in front of the display, they experience different parts of the rendered scene, as if they were seeing the scene through a physical window in the real world, as shown in Figure 3(a). 3D graphic scenes are rendered with the viewer's position as the virtual viewpoint and updated with the viewer's changing position to ensure correct 3D perspective. Alternatively, high-resolution still images of real-world landscapes are rendered in a similar fashion to give the viewer the impression of looking at the scene through a window. Another application is "viewer following windows," shown in Figure 3(b), in which each viewer is associated with an image/video application window. Each viewer, wearing distinct IR markers, is tracked and recognized, so that the corresponding window moves along with the viewer.

5. SPATIAL AUDIO
Humans use spatial localization of sound sources every day. Spatial hearing contributes to the "cocktail party effect," which allows focusing one's listening attention on a single

Figure 3 (a): virtual see-through window

Figure 3 (b): viewer following windows

Figure 4: Tiled display spatial audio. Audio for each window is reproduced by the loudspeakers spanning that window's location, and it moves and resizes with window move and resize operations.

audio stream among a cacophony of audio streams and background noise. High-quality surround sound and spatial audio systems exist today, e.g., 10.2-channel surround [8], PerAmbio 6.1.10 [9], NHK 22.2 [10], Ambisonics [11], and the Allosphere [12], but they have so far seen limited deployment. Their limitations for large wall displays are that special content must be created for them, special encoding and decoding of the audio channels is required, and special hardware and DSPs are needed due to their complexity. We have developed a spatial audio system for large displays. Our system (Figure 4) provides spatial audio based on an application window's location on the wall display: audio associated with an AV application whose window is at a certain location on the tiled display is reproduced by the audio reproduction devices associated with that area of the display. The audio reproduction location moves and resizes when the AV application window is moved and resized. Our approach conveys application height information, in contrast to popular consumer X.Y audio formats (e.g., 5.1, 7.1 audio), which do not provide this information, and it does so without special content encoding. Our approach supports spatial audio from multiple concurrent on-screen AV windows. We can handle any X.Y format audio content and spatialize it in real time based on the particular display geometry and loudspeaker configuration. Our approach works with a loudspeaker array or with a limited number of discrete loudspeakers. We implemented this spatial audio system without any special hardware, by reusing existing surround sound cards and developing software to use the discrete individual audio channels they support.

6. SCALABLE USER INTERFACE TOOLKIT
With our partner, the University of Illinois at Chicago's

Electronic Visualization Laboratory, we have developed a scalable user interface toolkit that can be used for tiled display application development. The toolkit supports developing applications on a tiled display with a familiar desktop look and feel. It supports the following features:
• Scalability: User interface components automatically adapt to the target display size, resolution, and application window size.
• Interaction device independence: Users can interact with the application using any supported interaction device (e.g., Wiimote, gyro mouse, etc.).
• Multi-application, multi-user: Multiple applications can use the toolkit, and multiple users can simultaneously interact with one or more applications.
Figure 5 shows a screenshot in which scalable user interface components overlaid on the application window scale automatically with the window size.
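The scalability feature above can be illustrated with a minimal sketch: components keep a fixed fraction of the application window, so they remain usable whether the window spans one tile or the whole wall. The function name and the fractions below are hypothetical, not the toolkit's actual API.

```python
# Illustrative sketch of auto-scaling UI geometry: a component's position
# and size are stored as fractions of the window, so resizing the window
# rescales the component proportionally. Names/values are assumptions.

def scaled_rect(win_w, win_h, frac_x, frac_y, frac_w, frac_h):
    """Pixel rectangle (x, y, w, h) of a component placed at fractional
    coordinates within a win_w x win_h application window."""
    return (round(win_w * frac_x), round(win_h * frac_y),
            round(win_w * frac_w), round(win_h * frac_h))

# The same close button, pinned to the top-right 5% of the window, on a
# single-tile-sized window and on a wall-sized window:
small = scaled_rect(1920, 1080, 0.95, 0.0, 0.05, 0.05)
large = scaled_rect(10240, 4608, 0.95, 0.0, 0.05, 0.05)
```

The same fractional description yields a 96-pixel-wide button on a single tile and a 512-pixel-wide button across the wall, keeping the control a constant share of the window.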

Figure 5: Scalable interface components auto-scale with window size

7. SYNCHRONIZATION DESIGN TOOL AND DISTRIBUTED INTER-TILE SYNCHRONIZATION ALGORITHM
In the SharpWall architecture described above, rendering nodes send parts of the overall image to individual display nodes. The display nodes

Figure 6: Process for creation of the synchronization mismatch video measurement set (target configuration: 2x2 tiled display)

then use a distributed synchronization algorithm to individually display their parts of the image on their display tiles, providing the overall perception of a single contiguous image. We have developed a two-phase distributed synchronization algorithm that achieves low-latency inter-tile synchronization for multiple applications with different frame rates. Additionally, we have developed a psycho-visual design tool for evaluating synchronization-mismatch perception on large ultra-high-resolution tiled displays. The tool takes the tile matrix dimensions, tile resolution, and tile bezel size of the target tiled display and provides design parameters for an inter-tile synchronization algorithm. This tool can be used to design low-cost tiled display systems that use distributed synchronization mechanisms for image display and do not require high-end graphics cards with Genlock/Frame Lock capability. Figure 6 shows the process for creating the synchronization mismatch video measurement set (for a 2x2 tiled configuration) used by the design tool.

8. REFERENCES
[1] D. Gaba, J. Stringer, S. S. Y. Kung, "Display Walls in Healthcare Education: What, Why, How", Stanford School of Medicine website, http://summit.stanford.edu/pdfs/DisplayWallGIRfinal.pdf
[2] University of Illinois at Chicago's Electronic Visualization Laboratory's LambdaVision Tiled Display website, http://www.evl.uic.edu/cavern/lambdavision/
[3] Texas Advanced Computing Center Stallion Tiled Display website, http://services.tacc.utexas.edu/index.php/stallion-user-guide

[4] Stanford Display Wall website, http://summit.stanford.edu/research/displaywall.html
[5] G. Humphreys, M. Houston, R. Ng, R. Frank, S. Ahern, P. D. Kirchner, J. T. Klosowski, "Chromium: a stream-processing framework for interactive rendering on clusters," Conference on Computer Graphics and Interactive Techniques, SIGGRAPH, July 2002
[6] B. Jeong, L. Renambot, R. Jagodic, R. Singh, J. Aguilera, A. Johnson, J. Leigh, "High-Performance Dynamic Graphics Streaming for Scalable Adaptive Graphics Environment," Proc. Supercomputing, 2006
[7] C. Yuan, "Creating Virtual 3D See-Through Experiences on Large-size 2D Displays", IEEE Virtual Reality 2009, pp. 237-238
[8] G. DellaSella, "Introducing the 10.2 Surround Format," Audioholics Magazine, Sept. 2006
[9] R. Miller, "Compatible PanAmbiophonic 4.1 and PerAmbiophonic 6.1 Surround Sound for Advanced Television, Beyond ITU 5.1," SMPTE 144th Int'l Convention, vol. 2, October 2002
[10] K. Hamasaki, T. Nishiguchi, R. Okumura, Y. Nakayama, A. Ando, "A 22.2 Multichannel Sound System for Ultrahigh-Definition TV," SMPTE Technical Conference and Exhibition, vol. 117, 2008, p. 40
[11] R. Elen, "Ambisonics: The surround alternative," Proceedings of the 3rd Annual Surround Conference and Technology Showcase, 2001
[12] T. Höllerer, J. Kuchera-Morin, X. Amatriain, "The AlloSphere: a large-scale immersive surround-view instrument," Proceedings of the 2007 Workshop on Emerging Display Technologies, ACM, 2007
