A Toolkit for building Continuous Media Applications

Tatsuo Nakajima
Japan Advanced Institute of Science and Technology
1-1 Asahidai, Tatsunokuchi, Ishikawa, 923-12, JAPAN
[email protected], http://mmmc.jaist.ac.jp:8000/
Abstract
Multimedia computing has emerged in the last few years as a major area of computer science. However, multimedia programming is very hard, since programmers need to take into account many complex facilities such as real-time processing, media synchronization, and dynamic QOS control. Since ordinary programmers take a long time to understand these facilities, multimedia toolkits that hide such complex facilities from programmers should be provided to make multimedia programming easier. A toolkit should also provide mechanisms for implementing large continuous media applications. In particular, programmers may expect to reuse existing applications in order to build their own applications quickly: if continuous media applications can be constructed by composing a small number of big modules, programmers can create them very quickly. In this paper, we describe a continuous media toolkit that has been developed by the Multimedia and Mobile Computing Group at the Japan Advanced Institute of Science and Technology. The toolkit has two characteristics. First, application programmers do not need to take real-time processing, media synchronization, and dynamic QOS control into account, since the toolkit hides such complexities from programmers. Second, the toolkit provides a scripting language that enables programmers to reuse existing programs; thus, programmers can create multimedia applications by modifying a small part of existing programs. Our toolkit is implemented on the Real-Time Mach microkernel. Currently, several continuous media applications have been created to demonstrate the effectiveness of our toolkit.
1 Introduction
In future computing environments, continuous media data such as audio and video will become very popular data types for building advanced distributed applications. Since audio and video are commonly referred to as timing-dependent continuous media, continuous media applications should take real-time resource management and inter-stream synchronization into account to ensure their timing constraints. Also, these applications should support dynamic QOS control[11, 2] for adjusting the quality of media according to system load, since the applications may be executed on various types of computers. Building continuous media applications on standard computers and networks such as Unix requires very hard programming[5]. Some researchers have proposed continuous media toolkits for such environments, and their toolkits make it easy to create continuous media applications[7, 4, 9, 17]. These toolkits do not ensure that the timing constraints of continuous media are satisfied, while the facilities that do ensure timing constraints require great effort to learn to use. Therefore, we believe that the advantage of continuous media toolkits is not only to make it easy to create continuous media applications: continuous media toolkits can also hide the complex facilities for ensuring timing constraints from programmers[2].

In this paper, we describe a continuous media toolkit that has been developed by the Multimedia and Mobile Computing Group at the Japan Advanced Institute of Science and Technology. The continuous media toolkit has the following two characteristics:

- Application programmers do not need to take real-time programming, media synchronization, and dynamic QOS control into account, since our toolkit hides such complex facilities from programmers.
- The toolkit provides a scripting language that enables programmers to reuse existing programs. Thus, programmers can create multimedia applications by modifying a small part of existing programs.
The toolkit is implemented on the Real-Time Mach microkernel[19] and uses the real-time resource management, processor reservation, and memory reservation provided by Real-Time Mach to ensure the timing constraints of continuous media. The toolkit also supports inter-media synchronization and dynamic QOS control. Therefore, programmers can create continuous media applications by composing several modules and defining the relationships between media streams, without taking such complex facilities into account. Our toolkit also provides mechanisms for implementing large continuous media applications: applications can be constructed by composing several big modules, which makes it possible to create them very quickly, and the big modules can themselves be created from existing modules by modifying a small part of them. The scripting language of our toolkit enables programmers to define a big module by composing several small modules, and it allows programmers to modify a small part of existing modules to create new modules quickly.

The remainder of this paper is structured as follows. In Section 2, we describe the requirements for building a continuous media toolkit. Section 3 presents a continuous media toolkit implemented on Real-Time Mach. In Section 4, we describe some experiments with our toolkit. Section 5 summarizes the paper.
2 Requirements for Continuous Media Toolkit

A toolkit for building continuous media applications should satisfy the following two requirements:

- Continuous media toolkits should hide the complex facilities for ensuring the timing constraints of continuous media from programmers.
- Continuous media toolkits should support mechanisms for creating continuous media applications easily.

We believe that the following three facilities make it difficult to build continuous media applications, and they should be hidden from ordinary programmers:

- Continuous media applications should adopt real-time resource management for ensuring the timing constraints of video and audio.
- Continuous media applications should take media synchronization into account for ensuring the quality of multiple streams.
- Continuous media applications should change the quality of media according to system load.

The first facility is real-time resource management, such as real-time scheduling, real-time synchronization, and real-time communication[10, 19]. Also, processor reservation[1] and memory reservation[14] are used to protect the CPU and memory resources of continuous media applications from malicious applications. These facilities ensure that video frames and audio samples are delivered to a display and a speaker before their timestamps exceed the current time; if the timing constraints are violated, the quality of the video and audio streams is significantly degraded. However, using such facilities makes programming continuous media applications very hard.

The second facility is media synchronization management[17, 20], which manages the synchronization between multiple media streams. In particular, the synchronization between an audio stream and a video stream must not be violated, since a voice may become difficult to recognize when the synchronization between the voice stream and the video stream showing the person who is speaking is broken. Continuous media toolkits should therefore take over the media synchronization management that makes it hard to program continuous media applications.

The third facility is dynamic QOS control management[12, 13], which changes the quality of media according to system load. For example, if the power of a computer is not enough to ensure the timing constraints of a media stream, a continuous media application should degrade the quality of media in order not to violate the timing constraints; the quality of a video stream can be degraded by reducing the frame rate or the resolution of individual frames. This facility requires programmers to acquire detailed knowledge of continuous media.

Our toolkit enables programmers to create continuous media applications by connecting several modules. This approach makes it dramatically easier to create continuous media applications. However, traditional approaches do not support mechanisms for building large continuous media applications. We believe that continuous media toolkits should also satisfy the following two requirements:

- Programmers should define a big composite module from several small modules, and continuous media applications should consist of a small number of big composite modules.
- Programmers should create a new module by modifying a small part of an existing big composite module.

The development costs of continuous media applications are significantly reduced by satisfying these requirements: the first allows programmers to create large continuous media applications very quickly, and the second decreases the amount of code that programmers must write for new applications.

Our toolkit is decomposed into two components that address these requirements respectively. The first component, called a stream manager, hides real-time programming, inter-stream synchronization, and dynamic QOS control from programmers. The second component is a scripting language that enables programmers to define a composite module that can easily be reused for creating a new module. In the next section, we describe how our toolkit solves these problems.
3 A Continuous Media Toolkit on Real-Time Mach
In this section, we describe the continuous media toolkit implemented on Real-Time Mach. In Section 3.1, an overview of our continuous media toolkit is presented. Section 3.2 describes the stream manager, which provides mechanisms for controlling media streams, and we describe the scripting language provided by our toolkit in Section 3.3.
3.1 Software Architecture
[Figure 1: Continuous Media Toolkit. An out-of-band component (GUI and module control) exchanges commands and callbacks with an in-band component, in which media flows from a source module through a filter module to a sink module.]

Our continuous media toolkit adopts a software architecture in which continuous media applications consist of two components, as shown in Figure 1. Although this software architecture is also adopted in other continuous media toolkits such as VuSystem developed at MIT[7] and Medusa developed at ORL[4], our toolkit provides a more powerful abstraction to programmers. Before describing the unique features of our toolkit, we present the software architecture adopted in it.

The first component in the software architecture is called an in-band component. The component contains
modules that process media data. Continuous media applications are constructed by connecting several modules, and the connections between the modules define media streams. In VuSystem and Medusa, this software architecture is adopted to make programming continuous media applications easy. In our toolkit, the software architecture is also used to hide the complex facilities for ensuring the timing constraints of media data from programmers. Our toolkit extends the basic software architecture to incorporate inter-stream synchronization and dynamic QOS control. The extended architecture allows programmers to define the relationship between media streams without taking media synchronization and dynamic QOS control into account.

The second component is called an out-of-band component. This component contains code that defines the configuration of a continuous media application: the configuration defines the connections between media modules and the relationship between media streams. The component also contains code for the user interface. Programmers who do not need to create new modules can create continuous media applications by considering only out-of-band components. VuSystem and Medusa provide scripting languages for making it easy to program applications. Since these scripting languages enable us to change programs while executing them, they make applications dramatically flexible; a program can also be debugged easily since it is executed by an interpreter. The out-of-band component of our toolkit likewise provides a scripting language and its interpreter, as shown in Figure 2. Thus, the toolkit enables us to create applications whose configurations are changed dynamically during execution. Our scripting language is more powerful than the languages provided by VuSystem and Medusa, since it enables us to define a big composite module from small existing modules, and the big module can be changed incrementally by replacing some of its modules.

[Figure 2: Structure of Our Continuous Media Toolkit. Continuous media applications sit on top of the scripting language interpreter and run-time, which in turn sits on the stream manager.]

3.2 Stream Manager

The in-band component of our toolkit is called a stream manager. The stream manager consists of three components, as shown in Figure 3. The first component is called a module controller, which manages the connections between modules. The second component is called a stream controller, which manages the relationship between media streams. The last component is a command controller, which delivers commands to the modules in the respective streams.

[Figure 3: Structure of Stream Manager. The command controller delivers start and stop commands; the stream controller handles stream synchronization and dynamic QOS control; the module controller connects modules, which process media data.]

3.2.1 Module Controller

As described in the previous section, our toolkit enables programmers to create a continuous media application by connecting several modules. For example, an MPEG player is constructed by connecting three modules: the first module fetches video frames from a storage system, the second module decompresses the MPEG video frames, and the third module draws the video frames on a display. By adopting the software architecture described in Section 3.1, a video monitor application can be constructed by replacing the storage module of the MPEG player with a camera module.

In our toolkit, modules are classified into three categories: sink modules, filter modules, and source modules. Since modules provide a uniform interface, a stream is structured by connecting modules linearly, as shown in Figure 4.

[Figure 4: Streams. An audio stream (Source 1, Filter 1, Sink 1) and a video stream (Source 2, Filter 2, Filter 3, Sink 2), each a linear chain of modules.]

Filter modules, which have one input port and one output port, transform input media streams into output streams. Any filter modules can be cascaded by using the uniform interface; examples of filter modules are compression modules and video conversion modules. Source modules, which have one output port, usually interface to input devices. For example, a module that retrieves video streams from disks and a video capture module are source modules. Sink modules, which have one input port, usually interface to output devices. A module that interfaces to a window system for presenting video frames is a typical sink module.

The modules have two kinds of interfaces. The first interface is called by the stream controller and is used for setting attributes and requesting commands such as start and stop. The second interface is used for sending media data between modules that are structured as a stream. Each module must ensure the timing constraints of media data by using the real-time facilities provided by Real-Time Mach (in [20], experiences with programming continuous media applications on Real-Time Mach are described, and the paper concluded that programming such applications requires deep knowledge of real-time programming), but the complexities caused by real-time programming are hidden from the other components. Thus, programmers who do not need to create a new module need not take care of real-time issues.
[Figure 5: Modules and Links. A source, two filters, and a sink are connected through stream links 1, 2, and 3 via stream ports.]
In our toolkit, an input stream port of one module and an output stream port of another module are connected by a stream link, as shown in Figure 5. The stream controller manages the links and the connections between links and modules. Since two modules are connected indirectly through stream links, a continuous media application can change its configuration dynamically by replacing a module connected to a link. Thus, an application can change its configuration by dynamically loading a program written in the scripting language. In fact, a shell interpreting programs written in the scripting language enables us to change an application's configuration interactively.
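As a concrete illustration, the three-module MPEG player of Section 3.2.1 might be wired up as follows. This is a minimal sketch in the notation of the sample program of Section 3.3.3; the module type names MpegFileSrc, MpegDec, and CameraSrc are hypothetical, and only the module, stream, and media link commands shown there are assumed:

    module .src -type MpegFileSrc
    module .dec -type MpegDec
    module .out -type XOutput
    stream element .st
    media link .ml1 -stream .st
    media link .ml2 -stream .st
    # Storage -> MPEG decoder -> display, joined only through links
    media link .ml1 connect -from .src.video -to .dec.video
    media link .ml2 connect -from .dec.video -to .out.video

Because the modules meet only at the links .ml1 and .ml2, turning the player into a video monitor amounts to creating a CameraSrc module and reconnecting .ml1 from its output port, leaving the rest of the pipeline untouched.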
3.2.2 Stream Controller
If a continuous media application contains several media streams, they may need to be controlled in the same way. In typical continuous media applications, audio streams and video streams must be synchronized to ensure the quality of the streams. Also, continuous media applications containing several video streams may need to degrade their qualities at the same time. As described in Section 2, media synchronization and dynamic QOS control solve these problems.

The stream controller provides the stream abstraction to programmers. Programmers can group several media streams into a composite stream, as shown in Figure 6. The stream controller manages the relationship between media streams and ensures the synchronization of all media streams grouped in a composite stream. Also, the dynamic QOS control provided by the stream controller manages the qualities of the grouped media streams in the same way according to system load. The stream controller also delivers commands from an out-of-band component to the respective modules, in cooperation with the command controller described in the next section.

[Figure 6: Composite Stream. Audio stream 1 and video stream 1 form composite stream 1; audio stream 2 and video stream 2 form composite stream 2; composite streams 1 and 2 form composite stream 3.]

Figure 6 shows four media streams. Each stream contains three modules that are connected in a linear order.
In the example, audio stream 1 and video stream 1 are grouped into composite stream 1, and audio stream 2 and video stream 2 are grouped into composite stream 2. Also, the two composite streams are grouped into composite stream 3. As described above, the synchronization between these four streams is ensured, and the qualities of the streams are managed in the same way. For example, the quality of video stream 2 will be degraded when the quality of video stream 1 is degraded due to heavy system load. When a command is delivered to composite stream 3, the stream controller delivers the command to the respective modules in the four streams. For example, if a start command is received by composite stream 3, the command is delivered to composite stream 1 and composite stream 2, and the stream controller then sends the command to the modules in the respective streams. Therefore, once programmers define the relationship between media streams, they do not need to know how commands are delivered to the respective modules.
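In the notation of the sample program of Section 3.3.3, the grouping of Figure 6 might be written as the following sketch; the stream names are illustrative, and it is an assumption that stream group accepts composite streams as members, as the text above suggests:

    stream element .a1     ;# audio stream 1
    stream element .v1     ;# video stream 1
    stream element .a2     ;# audio stream 2
    stream element .v2     ;# video stream 2
    stream group .cs1 -with .a1 -with .v1
    stream group .cs2 -with .a2 -with .v2
    stream group .cs3 -with .cs1 -with .cs2

A start command delivered to .cs3 then reaches every module in the four leaf streams, and the stream controller keeps the grouped streams synchronized and degrades their qualities together.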
3.2.3 Command Controller
The command controller manages the delivery of commands from the out-of-band component to the respective modules. It also delivers callback commands from modules to the out-of-band component. Several commands, such as start and stop, are predefined in the command controller. Commands are implemented as objects: when a command is issued, the object representing the command is passed to the stream controller, which calls the function indicated by the command according to the relationship between media streams.

The command controller allows programmers to add a new command by defining a new object representing the command. However, the new command may require a different ordering policy from the policies of the predefined commands. For example, a start and a stop command are delivered from source modules to sink modules in sequential order, because some media elements remain stored in intermediate buffers if a sink module receives a stop command before the source module does; those media elements would be dropped since their timing constraints are violated.

[Figure 7: Command Delivery. Start and stop commands are delivered along the modules of an audio stream and a video stream in order.]

The command controller allows programmers to define different ordering policies for the respective commands. As shown in Figure 7, each command can define its own command delivery object that contains an ordering policy for the command. When a command object is passed to the stream controller, the command is delivered to the respective leaf streams. The stream controller executes the code contained in the command delivery object for each leaf stream, and the methods for the command in the respective modules are called according to the ordering policy.
3.3 Scripting Language for Continuous Media Toolkit
In this section, we describe the scripting language provided by our toolkit. The stream manager described in the previous section provides a set of mechanisms for building continuous media applications. A program written in the scripting language determines the configuration of a continuous media application and decides the policies for how the mechanisms provided by the stream manager are used. This clean separation between the configuration of modules and the implementations of the modules makes the development of applications easy. As described in Section 3.3.1, the scripting language allows programmers to compose a big composite module from a small number of existing modules. Since a composite module can be extended or customized incrementally, programmers can create a continuous media application by modifying a small part of an existing module. In our toolkit, the out-of-band component contains an interpreter for executing programs written in the scripting language and a run-time system that contains the code for calling the stream manager. The run-time system also contains an event system, which is described in Section 3.3.2.
3.3.1 Composite Modules and Extensibility

The stream manager allows us to create continuous media applications by connecting several modules and defining the relationship between media streams. However, since it does not support a mechanism for grouping modules and streams into a big module, programmers must know all the modules that are necessary in their applications. This approach causes two problems when programmers want to reuse existing applications. The first problem is that there is no mechanism for grouping several modules into one composite module. If a continuous media toolkit supports composing several modules into a big module, programmers can create continuous media applications by composing a small number of big composite modules[6], which makes it faster to create applications. The second problem is that there is no mechanism to modify existing applications for creating new applications quickly. In traditional approaches, there is only a low level of abstraction for structuring applications; the left picture in Figure 8 shows this approach. Programmers must know how the modules contained in the application are connected and how the media streams are grouped. If the application contains many modules, it is difficult to reuse it for creating a new application. The right picture in the figure shows an alternative approach, which provides multiple levels of abstraction and allows programmers to choose the appropriate level of detail for a given implementation. If programmers want to reuse applications, they can pick the right level of abstraction and modify the application by focusing on that level.

[Figure 8: Extensible Modules. The traditional approach exposes a flat collection of modules, while our approach organizes modules into multiple levels of abstraction.]

Our toolkit provides two mechanisms for solving the problems described in the previous paragraph. The first mechanism enables programmers to group several modules and streams into a big composite module. The second mechanism is an event system. The event system enables programmers to connect several modules, similar to the mechanism for connecting modules using stream links: a stream link determines how media elements are transferred between modules, whereas the event system determines how commands are transferred between modules. This mechanism is very useful for providing multiple levels of abstraction in out-of-band components. However, if a composite module is treated as a black box, programmers need to know all the details of the module when they want to reuse it. Our scripting language instead allows programmers to focus on the appropriate level of abstraction, which improves the reusability of applications significantly.

3.3.2 Composite Modules and Event System

[Figure 9: Event System. Module 1 publishes an event that is delivered to Modules 2, 3, and 4, which subscribe to it.]
The event system provided by our toolkit is simple, since this simplicity makes it easy to program continuous media applications. As shown in Figure 9, assume that Modules 2, 3, and 4 subscribe to events; then, if Module 1 publishes an event, the event is delivered to Modules 2, 3, and 4. In our event system, programmers cannot specify the order in which the event is delivered to the three modules, so the correctness of an application must be ensured even if the event is delivered to the modules in any order.

The event system provides a suitable mechanism for extending existing composite modules incrementally. The first way to extend a module is to add a module that subscribes to an event in the composite module; this does not require changing the existing parts of the composite module. The second way is to replace some modules included in the composite module. In a typical case, a module that publishes an event is replaced. For example, assume that a composite module contains a module that publishes an event indicating a start command, generated by pushing a GUI button. Now a programmer wants to create a module whose functionality is the same as the existing module, except that the new module generates the start command by recognizing a user's voice. In our approach, the programmer can create the new module from the existing one by replacing only the module that generates the start command. Therefore, the approach enables us to modify existing modules incrementally to create new modules.

In our toolkit, a callback notification delivered from the stream manager can be treated as an event. For example, when a module drops a video frame due to system overload, the module sends a callback message to the run-time system of the scripting language. Also, if a module detects degradation of the quality of a media stream, a callback message is delivered to the run-time system. The run-time system converts the callback message into an event, and the event is published to an event link that is connected to the input event ports of modules.

Figure 10 shows how our event system connects modules. The example contains six modules. A module publishes an event via an output event port, and it can subscribe to events through an input event port. An event link connects one output port and several input ports. In the figure, Modules 1 and 3 subscribe to events published on event link el1, Modules 5 and 6 subscribe to event link el3, and Module 2 subscribes to event link el2. If Module 4 publishes an event on el1, the event is delivered to Modules 1 and 3; if Module 4 publishes an event on el2, the event is delivered to Module 2. In this approach, a composite module can be modified easily by reconfiguring the event links between modules.

Figure 11 shows a simple application using the event system. In the example, the composite module XDisplay contains two modules, VideoConv and XOutput. XDisplay has an input stream port for receiving video frames and an input event port for receiving commands from other modules. The exported input event port is connected to VideoConv and XOutput by an event link inside XDisplay, and VideoConv and XOutput are connected by a stream link for transmitting video frames between the modules. Also, the output stream port of the module Camera is connected to the input stream port of XDisplay, and the output event port of the module Start is connected to the input event ports of XDisplay and Camera. Therefore, a start command from Start is delivered to both XDisplay and Camera, and the command is then also delivered to VideoConv and XOutput inside the module XDisplay.
[Figure 10: Modules connected by Events. Module 4 publishes to event links el1 and el2; Modules 1 and 3 subscribe to el1, Module 2 subscribes to el2, and Modules 5 and 6 subscribe to el3.]
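A hedged reading of Figure 10 in the notation of the sample program of Section 3.3.3 is sketched below; the module and port names are illustrative, and it is an assumption that an event link accepts several connect calls in order to fan an event out to multiple subscribers:

    event link .el1
    event link .el2
    # Module 4 publishes on el1; Modules 1 and 3 subscribe
    event link .el1 connect -from .m4.out -to .m1.cmd
    event link .el1 connect -from .m4.out -to .m3.cmd
    # Module 4 publishes on el2; Module 2 subscribes
    event link .el2 connect -from .m4.out2 -to .m2.cmd

Replacing the publisher then requires only reconnecting .el1 and .el2 to the ports of a new module; the subscribers remain untouched.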
[Figure 11: Structure of Continuous Media Application using the Script Language. The output event port of Start is connected to the input event ports of Camera and of the composite module XDisplay, which contains VideoConv and XOutput; Camera's output stream port feeds XDisplay's input stream port.]

The advantage of our approach is that programmers can replace modules in composite modules by choosing the appropriate level of abstraction. For example, assume that a new composite module consisting of Start, Camera, and XDisplay is created; we call this module Monitor. If a programmer needs to create a new module that monitors a room with a camera, the module Monitor may be reused without modification. However, suppose the programmer wants to start the monitoring by recognizing a user's gesture. In this case, our approach allows the programmer to replace Start in Monitor with GestureStart, which delivers a start command when a user makes a sign to the camera, simply by reconfiguring an event link. In traditional approaches, the programmer would create a composite module by composing GestureStart, Camera, VideoConv, and XOutput, and would thus have to take the lowest level of abstraction of the module Monitor into account. In our approach, the programmer need not consider the internals of XDisplay. Thus, the approach enables programmers to reuse existing modules by focusing on the appropriate level of abstraction and changing a small part of the modules, and a new application can be constructed by composing a small number of modified composite modules.
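A sketch of this replacement, under the same assumptions as the earlier sketches (GestureStart, the port names, and the behavior of reissuing connect are all illustrative):

    # Original trigger inside Monitor: a GUI button
    event link .ev connect -from .start.out -to .xdisplay.start
    event link .ev connect -from .start.out -to .camera.start
    # Replace the trigger with gesture recognition
    module .gstart -type GestureStart
    event link .ev connect -from .gstart.out -to .xdisplay.start
    event link .ev connect -from .gstart.out -to .camera.start

The internals of XDisplay are never opened; only the event link feeding its exported event port is reconfigured.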
3.3.3 Sample Application using Our Scripting Language
The scripting language provided by our toolkit is an extension of the Tcl scripting language[16]. Our scripting language can call functions provided by the stream manager and uses the event system described in the previous section. In this section, we show a sample program written in the scripting language. The sample program contains two streams that fetch media elements from our continuous media storage system[21]: it defines a video stream and an audio stream, and the streams are grouped into a composite stream.

01: module -newtype Control {
02:     event link .start
03:     event link .stop
04:     button .b1 -text start -command {
05:         event link .start publish}
06:     button .b2 -text stop -command {
07:         event link .stop publish}
08:     pack .b1 .b2
09: }
10:
11: module -newtype QtPlay {
12:     module .vsrc -type VideoCrasSrc
13:     module .vc -type VideoConv
14:     module .vo -type XOutput
15:     module .asrc -type AudioCrasSrc
16:     module .ao -type AudioOutput
17:     stream element .st1
18:     stream element .st2
19:     stream group .st -with .st1 -with .st2
20:     event link .ev1
21:     event link .ev2
22:     event link .ev1 connect -from .start \
23:         -to .st.start
24:     event link .ev2 connect -from .stop \
25:         -to .st.stop
26:     media link .ml1 -stream .st1
27:     media link .ml2 -stream .st1
28:     media link .ml3 -stream .st2
29:     media link .ml1 connect -from .vsrc.video \
30:         -to .vc.video
31:     media link .ml2 connect -from .vc.video \
32:         -to .vo.video
33:     media link .ml3 connect -from .asrc.audio \
34:         -to .ao.audio
35:     pack .vo
36: }
37:
38: event link .start
39: event link .stop
40: module .gui -type Control
41: module .qtplay -type QtPlay
42: event link .start connect -from .gui.start \
43:     -to .qtplay.start
44: event link .stop connect -from .gui.stop \
45:     -to .qtplay.stop
46: pack .gui .main

In the example, there are two composite modules, Control and QtPlay. Line 1 is the definition of the composite module Control, which contains the code for the user interface. The module creates two event links, .start and .stop, in lines 2 and 3. If the start button is pushed, a start command is published to the event link .start; when the stop button is pushed, a stop command is published to the event link .stop. Lastly, in line 8, the start and stop buttons are shown on the display.

From line 11 to line 36, the composite module QtPlay is defined. Three modules for processing a video stream and two modules for processing an audio stream are created from line 12 to line 16. Two streams, .st1 and .st2, are created and grouped from line 17 to line 19. From line 20 to line 25, event links are created and connected to the event ports of QtPlay, which are used for connecting these ports to the event ports of other modules; thus, a start command and a stop command can be delivered to the stream .st. Several media links are created, and the modules for processing media elements are connected, from line 26 to line 34. Here, .st1 is defined as a video stream and .st2 is defined as an audio stream. In line 35, a video window appears on the display. From line 38 to line 46, Control and QtPlay are connected using the event links .start and .stop, and the application starts executing.

[Figure 12: Example of Continuous Media Application. The Start and Stop buttons of Control publish to the event links .start and .stop, which are connected via .ev1 and .ev2 to .st.start and .st.stop in QtPlay; inside QtPlay, media links .ml1 and .ml2 connect VideoCrasSrc, VideoConv, and XOutput along the video stream .st1, and .ml3 connects AudioCrasSrc to AudioOutput along the audio stream .st2.]
The sample program shows that programmers need not take real-time programming, media synchronization, and dynamic QOS control into account when our scripting language is adopted. It also shows that composite modules make it quick to create a complex continuous media application without great effort.
4 Discussion
Our continuous media toolkit significantly reduces the effort of implementing continuous media applications that ensure the timing constraints of audio and video. However, our experiments with the toolkit show that the current implementation has several limitations. In this section, we present four problems caused by these limitations.

The event system used for creating composite modules causes the first problem. In the current implementation, programmers need to ensure that their programs run correctly even if events delivered to multiple modules are processed in any order; the current event system does not support mechanisms for controlling the order. However, a more powerful event system that can control the order of events increases the complexity of programming, and since one of the purposes of the toolkit is to make programming continuous media applications easy, an event system that makes programming hard is not preferable. We need to investigate a powerful event system that does not make programming continuous media applications hard.

The second problem occurs due to the software architecture adopted in our toolkit. The software architecture allows us to build continuous media applications that contain multiple media streams easily. However, we assume that each stream has one source module and one sink module. This assumption makes it difficult to build applications that communicate many-to-many, such as conference systems. The MBONE tools provide the Conference Bus[8], which is a good abstraction for building such applications. We need to investigate whether the Conference Bus can be integrated with the stream concept adopted in our toolkit.

The third problem occurs due to the flexibility of our toolkit. Our toolkit allows programmers to change inter-stream synchronization algorithms and dynamic QOS control policies for their applications. However, the toolkit does not provide a mechanism for selecting which algorithms are suitable for a given application. Also, our toolkit is not suitable when different streams in an application require different algorithms. Moreover, we need to investigate whether algorithms for inter-stream synchronization and dynamic QOS control can be selected independently.

The last problem occurs when the toolkit is used with other libraries such as a window library or a database library. Our toolkit assumes that programmers do not take the control flow of applications into account; this means that the toolkit manages the entire control flow of an application. Thus, a problem may occur if a library used with our toolkit has its own control policy. For example, the X window toolkit has its own control loop for processing events from users. If our toolkit is used with such libraries, programmers may have to modify a large part of our toolkit or of the libraries to resolve the conflict between their control policies.
5 Conclusion
In this paper, we presented a continuous media toolkit implemented on Real-Time Mach. The toolkit makes it very easy to build continuous media applications that ensure the timing constraints of continuous media such as audio and video, since programmers need not take complex facilities such as real-time programming, inter-stream synchronization, and dynamic QOS control into account. Also, the scripting language allows programmers to create composite modules: the event system provided by the toolkit can compose small modules into one big module in a loosely coupled fashion, and composite modules allow programmers to replace internal modules to create new composite modules from existing ones. The approach enables programmers to create large continuous media applications without great effort.

One of the goals of our toolkit is to support building mobile continuous media applications. Our plan is to integrate our toolkit with the service proxy toolkit[3, 15], which provides mechanisms for building adaptive mobile applications. The extended toolkit should enable programmers to build mobile continuous media applications that can be executed on small, less powerful mobile computers, as well as continuous media applications that can move anywhere according to the location of users.
Acknowledgments

We are grateful to Jun Noritake and Kouki Oohira for very helpful discussions. We would also like to thank the members of the MMMC Project at JAIST for their valuable comments and input.

References

[1] H. Fujita, H. Tezuka, and T. Nakajima, "A Processor Reservation System Supporting Dynamic QOS Control", In Proceedings of the 2nd International Workshop on Real-Time Computing, Systems, and Applications, IEEE, 1995.
[2] S. Furuno and T. Nakajima, "A Toolkit for Building Continuous Media Applications using a New Dynamic QOS Control Scheme", Multimedia Japan, 1996.
[3] A. Hokimoto, K. Kurihara, and T. Nakajima, "An Approach for Constructing Mobile Applications using Service Proxies", In Proceedings of the 16th International Conference on Distributed Computing Systems, 1996.
[4] A. Hopper, "The Medusa Applications Environment", Technical Report 94.12, Olivetti Research Limited, 1994.
[5] V. Jacobson, "Multimedia Conferencing on the Internet", SIGCOMM'94 Tutorial, 1994.
[6] S. Jaeger, "Mega-Widgets in Tcl/Tk: Evaluation and Analysis", In Proceedings of the Tcl/Tk Workshop, 1995.
[7] C. J. Lindblad and D. L. Tennenhouse, "The VuSystem: A Programming System for Compute-Intensive Multimedia", IEEE Journal on Selected Areas in Communications, 1996.
[8] S. McCanne and V. Jacobson, "vic: A Flexible Framework for Packet Video", ACM Multimedia, 1995.
[9] S. McCanne et al., "Toward a Common Infrastructure for Multimedia-Networking Middleware", NOSSDAV'97, 1997.
[10] T. Nakajima, T. Kitayama, H. Arakawa, and H. Tokuda, "Integrated Management of Priority Inversion in Real-Time Mach", In Proceedings of the Real-Time Systems Symposium '93, IEEE, 1993.
[11] T. Nakajima and H. Tezuka, "A Continuous Media Application Supporting Dynamic QOS Control on Real-Time Mach", In Proceedings of the ACM Multimedia, 1994.
[12] T. Nakajima, "A Dynamic QOS Control based on Optimistic Processor Reservation", In Proceedings of the 3rd International Conference on Multimedia Computing and Systems, IEEE, 1996.
[13] T. Nakajima and H. Fujita, "Experience with Adaptive QOS Mapping Scheme", In Proceedings of the 3rd International Workshop on Real-Time Computing, Systems, and Applications, IEEE, 1996.
[14] T. Nakajima and H. Tezuka, "Virtual Memory Management for Interactive Continuous Media Applications", In Proceedings of the 4th International Conference on Multimedia Computing and Systems, 1997.
[15] T. Nakajima and A. Hokimoto, "Adaptive Continuous Media Applications in Mobile Computing Environments", In Proceedings of the 4th International Conference on Multimedia Computing and Systems, 1997.
[16] J. K. Ousterhout, "Tcl and the Tk Toolkit", Addison-Wesley, 1994.
[17] K. Rothermel, I. Barth, and T. Helbig, "CINEMA - An Architecture for Configurable Distributed Multimedia Applications", Technical Report 3/1994, University of Stuttgart/IPVR, 1994.
[18] K. Rothermel and T. Helbig, "An Adaptive Protocol for Synchronizing Media Streams", ACM/Springer Multimedia Systems, 1996.
[19] H. Tokuda, T. Nakajima, and P. Rao, "Real-Time Mach: Towards a Predictable Real-Time System", In Proceedings of the USENIX First Mach Symposium, 1990.
[20] H. Tezuka and T. Nakajima, "Experiences with Building a Continuous Media Application on Real-Time Mach", In Proceedings of the 2nd International Workshop on Real-Time Computing, Systems, and Applications, IEEE, 1995.
[21] H. Tezuka and T. Nakajima, "Simple Continuous Media Storage Server on Real-Time Mach", In Proceedings of the USENIX Technical Conference, 1996.