International Conference on Biocomputation, Bioinformatics, and Biomedical Technologies
Next-generation Collaboration Environments for Interactive Tele-medical Consultation Kyungtae Kong, Namgon Kim, Sangwoo Han, JongWon Kim
Abstract— In this paper, to provide interactive tele-medical consultation, a next-generation collaboration environment is designed by bridging on-line and off-line medical services together in a place called the CMC (cyber medical center). The CMC serves as the base infrastructure by providing on/off-line solutions for medical information sharing. Based on the CMC model, we introduce example service scenarios that improve the effectiveness of interactive tele-medical consultations. We then discuss how to realize the CMC service scenarios by employing the SMeet (smart meeting space) collaboration environment. In particular, proof-of-concept verifications are successfully made to justify the potential of the SMeet prototype in realizing the CMC vision.

Index Terms—Advanced collaboration environment, tele-medical consultation, interactive collaboration, smart meeting space, cyber medical center.

I. INTRODUCTION

Ubiquitous computing environments supported by advanced computing systems and high-performance networks can realize futuristic collaboration among remote sites, where participating users interact naturally to complement their working environments. Such advanced collaboration environments (ACEs) [1] are expected to cover next-generation presence-style business meetings, tele-commuting, distance learning, and tele-medicine [2]. In particular, for cyber-medical collaboration, an ACE enables doctors to expand their mutual cooperation internationally. In the medical field, doctors frequently make rounds and hold consultation meetings to discuss the conditions, diagnoses, and treatments of patients [3]. Thus, by expanding the physical coverage of doctors, the level of collaboration among medical doctors can be improved significantly. For example, a tele-medicine trial at the University of Virginia provides patients as well as health professionals with clinic consulting, tele-radiology, and home-health applications in 35 medical fields [4]. Another example is the Eastern Montana Telemedicine Network (EMTN) [5]. To assist residents in rural areas of Montana and Wyoming, EMTN utilized two-way interactive video conferencing to deliver specialists' medical and mental health services as well as continuing advanced medical education. Also, in [11], a Web-based application is implemented as a collaborative working environment for physicians by enabling the peer-to-peer exchange of electronic health records; it also addresses technological issues such as video, audio, and message communication, workspace management, and distributed medical data management and exchange. However, these tele-medicine trials fall short of the stringent quality requirements of medical collaboration [6-7]. To obtain an accurate diagnostic view, doctors need to exchange (and share) high-quality visuals of patients' affected areas and browse their medical history in real time. Occasionally, they may want to share medical visuals in stereoscopic 3D, as illustrated in [2] for an emergency scenario involving doctors and paramedics. Thus, in this paper, to provide interactive tele-medical consultation, a next-generation collaboration environment is designed by bridging on-line and off-line medical services together in a place called the CMC (cyber medical center). The CMC serves as the base infrastructure by providing on/off-line solutions for futuristic medical information sharing. Based on the CMC model, we introduce example service scenarios that improve the effectiveness of interactive tele-medical consultations. We then introduce the SMeet (smart meeting space) collaboration environment [8] and explain its architecture as well as the functionalities it can support. Finally, by employing a prototype version of SMeet, we showcase how to realize one CMC service scenario. In particular, to justify the potential of the SMeet prototype in realizing the CMC vision, proof-of-concept verifications are successfully made by demonstrating high-quality visual sharing capabilities and user-interactive functionalities for tele-medical consultation.

The organization of this paper is as follows. In Section II, after introducing the CMC model, we discuss the service scenarios for interactive tele-medical consultation and the required cyber infrastructure. Section III covers the SMeet interactive collaboration environment, where the hardware and software architectures of SMeet are introduced. The verification results of the SMeet prototype for the CMC are provided in Section IV. Finally, we conclude the paper in Section V.
This work was supported by the Foundation of Ubiquitous Computing and Network (UCN) Project, the Ministry of Knowledge Economy (MKE) 21st Century Frontier R&D Program in Korea (08B3-O2-10M).
Kyungtae Kong is with the Leading Technology Research TFT, Future Technology Laboratory, Korea Telecom (KT), Seoul, Korea (e-mail: [email protected]).
Namgon Kim, Sangwoo Han, and JongWon Kim are with the Networked Media Lab., Dept. of Information and Communications, Gwangju Institute of Science and Technology (GIST), Gwangju, Korea (e-mail: [email protected]).
978-0-7695-3191-5/08 $25.00 © 2008 IEEE DOI 10.1109/BIOTECHNO.2008.28
II. CYBER MEDICAL CENTER AND SERVICE SCENARIOS FOR COLLABORATIVE TELE-MEDICINE
A. CMC Model for Interactive Tele-medical Consultation
To establish the CMC model described in this paper, the Korea Telecom (KT) Future Technology Laboratory has conducted extensive case studies on collaborative medical processes and services. In these case studies, the following medical trend is evident: traditional medical diagnoses based on independent, hospital-centered, one-way environments are being transformed into patient-centered, incorporated, two-way, and integrated ones. Thus, the CMC should focus on early disease checkup, tele-monitoring, and tele-medical consultation, which interfere less with the essential clinical services of doctors. For example, by linking the CMC with remote houses, we can implement a tele-monitoring healthcare service for elderly patients with chronic diseases. NBIT (nano-bio information technology)-based diagnosis [9] is also useful for early disease checkups. In particular, tele-medical consultation requires doctors to discuss a patient's condition by sharing medical records and to decide on a medical treatment.

Figure 1 shows one of the tele-medical collaboration scenarios, namely the tele-radiology collaboration BM (business model). This BM aims to minimize the danger of an erroneous diagnosis caused by the lack of communication between doctors diagnosing in different places. Moreover, it is designed to assure patients of quality treatment based on the improved accuracy of collaborative diagnosis. To describe medical episodes (i.e., service scenarios), we apply UML (unified modeling language) after clarifying the use-case diagram, which provides each actor with detailed actions to be processed as well as ordered sets of actions. Figure 2 shows the use-case diagram of tele-radiology collaboration, which describes the various activities of the actors (e.g., patient, doctor, qualified radiologist).

Fig. 1: An example service scenario for CMC: Tele-radiology collaboration.

Fig. 2: Tele-radiology collaboration use case diagram.

B. Required Infrastructure for CMC
To implement this tele-radiology collaboration BM, we need to develop diverse systems and databases for the CMC, which are partially listed in Table 1. Note that, as part of this CMC infrastructure, we also need an interactive next-generation medical collaboration environment. This environment should entail more than basic (e.g., audio-video sharing only) collaborative tools. To be more specific, the interactive collaboration environment for tele-radiology can visualize ultra-high-quality medical objects on a networked tiled display and support multi-modal natural interactions for participants using HCI (human-computer interaction) devices that recognize users' pointing and gestures. We may even support follow-me-type location-aware visualization [10]. By analyzing the objects and data of the radiology collaboration episode, the CMC requires the diverse system modules and databases listed in Table 1. The CMC should have a system module for radiology image retrieval to exchange radiological images. It also needs system modules for clinical history management and patient visit management to associate patients with their medical history and hospital visits, respectively. In particular, the system module for the collaboration environment plays an essential role in the CMC. The required databases include the patient DB, disease DB, doctoral schedule DB, radiological imaging request DB, radiological images DB, interpretation opinion DB, clinical history DB, and others.

Table 1: System modules and databases required for the tele-radiology collaboration episode.
Episode: Tele-radiology collaboration
Objects: Patient, doctor, CMC (cyber medical center)
Requisite system modules: Clinical history management; hospital information/reservation; doctor's information reference; patient's visit management; PACS and radiology image retrieval; interpretation management; collaboration environment (including recording)
Requisite databases: Patient database; disease database; doctoral schedule database; radiological imaging request database; radiological images database; interpretation opinion database; clinical history database
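To make the relation between these modules and databases concrete, the sketch below is our own illustration (not part of the CMC implementation) of how a radiological imaging request might tie together the patient, radiological images, and interpretation opinion databases of Table 1; all class and field names are hypothetical.

```python
# Illustrative sketch only: hypothetical record types linking the databases of Table 1.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Patient:                  # entry of the patient DB
    patient_id: str
    name: str
    clinical_history_ids: List[str] = field(default_factory=list)

@dataclass
class RadiologicalImage:        # entry of the radiological images DB (e.g., retrieved via PACS)
    image_id: str
    patient_id: str
    modality: str               # e.g., "CT", "X-ray"
    uri: str                    # where the image can be fetched

@dataclass
class ImagingRequest:           # entry of the radiological imaging request DB
    request_id: str
    patient_id: str
    requesting_doctor: str
    image_ids: List[str] = field(default_factory=list)
    interpretation_id: Optional[str] = None   # filled once the radiologist reports

@dataclass
class InterpretationOpinion:    # entry of the interpretation opinion DB
    interpretation_id: str
    request_id: str
    radiologist: str
    findings: str
```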
A specific stage of the tele-radiology collaboration scenario is detailed in Table 2. This case describes a co-work situation based on the collaborative solution of the CMC between a radiologist and a physician. Table 2 also shows the functions required to handle this specific medical situation, the data needed by the participating objects (e.g., radiology image, interpretation data, and collaboration result data), and the required system modules.
Table 2: Example of service processes for tele-radiology collaboration.
Situation: After CT scans are taken, a radiologist asks the CMC for a joint diagnosis and treatment. The radiologist analyzes the scanned image with a physician, identifies the affected area (lung cancer), and refers the patient to a general hospital.
Objects: Radiologist, physician, CMC
Messages between objects (functions): The radiologist takes CT scans; the radiologist asks the CMC for collaboration; the radiologist and the physician diagnose together; the radiologist uploads the scanned image to the CMC; the radiologist writes the interpretation document.
Data: Scanned image; interpretation document; results of collaborative diagnosis
System: Hospital search and reservation; DB of hospitals and specialists; DB of radiologists' interpretation; automated recording of collaborative diagnosis; collaborative diagnosis and treatment; DB of the scanned images

The following are the expected advantages of this tele-radiology collaboration scenario. From the perspective of patients, they can rely on the treatment of medical specialists thanks to a more accurate diagnosis. Doctors can diagnose, treat, and prescribe more professionally by complementing the initial diagnosis with the specialized interpretation of radiological images. Lastly, radiologists can present their interpretation in more detail and expect financial rewards for their specialized expertise.

III. SMEET INTERACTIVE COLLABORATION ENVIRONMENTS FOR MEDICAL SCENARIOS
In this section, we describe the conceptual vision of an interactive collaboration environment for the CMC and propose a system architecture to realize this vision.

A. Interactive Collaboration Environment for CMC
Fig. 3 depicts the concept of the interactive collaboration environment, where medical contents (e.g., real-time live audio and video, large-scale medical visualization contents, and medical applications) are shared between collaborators participating in a collaborative work environment through a multicast-enabled network. Based on the cyber-medical scenario, people can discuss issues with remote colleagues with the help of interactive collaboration environments that offer high-quality video and audio sharing, three-dimensional medical graphical objects, and close-up views of meeting documents over an interactive tiled display. An interactive collaboration environment can be organized with a variety of services and devices. For typical room-based collaboration, the environment has to be equipped with a set of audio, video, interaction (e.g., smart pointers, hand-motion trackers, localization sensors), and display systems (e.g., full-HD plasma displays, networked tiled displays), together with supporting machines. All devices should be connected to a high-performance wired local network, for which more than 1 Gbps of capacity is recommended, and collaboration nodes should be interconnected by a 10 Gbps wide-area network. Note that the following functionalities are specially considered.

Fig. 3: Interactive collaboration environments for CMC.

1) Media and networking support: Seamless and high-quality bi-directional media streaming is important to improve the quality of user experience in collaboration environments. Using the various audio and video devices, user intentions expressed through visual and auditory cues must be acquired and delivered to remote participants. For this, networking support (e.g., multicast connectivity, network performance monitoring) is necessary to offer continuous media streaming over time-varying networks.

2) High-resolution display support: To show precise and realistic medical contents (which may appear as several full-HD videos), the display system must cover a large-scale screen size and ultra-high resolution. For this, scalable extension of the display resolution should be possible. The display system should also support a variety of video formats and present multiple, simultaneous video streams encoded with different codecs. Note that the networked display capability of the tiled display enables multiple high-resolution videos and images to be received from remote collaboration nodes.

3) Display interaction support: For more comfortable collaboration, people should be able to interact naturally with the collaboration environment without skilled operation techniques. Interaction devices provide user-friendly interaction with the environment. Pointing services and hand-motion tracking services help people easily operate the visual contents shown on the large-scale display system. By integrating with the localization services, the pointer services offer location-aware display pointing.

B. Software Architecture Overview
The proposed architecture is depicted in Fig. 4. First, it attempts to build a collaboration service from component services with unique functionalities. By selecting a set of component services depending on node capabilities and meeting modes, a collaboration node can be flexibly configured. Then, in order to organize services according to the context of a given collaboration session, the SMeet Mediator is introduced. The SMeet Mediator is designed to manage all the services and the meeting context as the node representative. When handling the job of arbitration for collaboration, the SMeet Mediator consults either the inference engine (for automatic operation) or the collaboration participants (for manual intervention by users through specific interaction interfaces such as a GUI). It is believed that harmonizing automatic and manual consultation is the ideal target.
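As a rough illustration of how such mediation could work, the sketch below is our own simplification (hypothetical service names and selection logic, not the actual SMeet implementation) of selecting component services for a node from its device capabilities and the meeting mode.

```python
# Illustrative sketch: capability- and mode-driven composition of component services,
# in the spirit of the SMeet Mediator; names and logic are our own simplification.

SERVICE_REQUIREMENTS = {
    "media_producer":       {"camera"},
    "media_consumer":       {"display"},
    "networked_display":    {"tiled_display"},
    "pointing":             {"laser_pointer", "tiled_display"},
    "hand_motion_tracking": {"hand_tracker"},
}

def compose_services(capabilities, meeting_mode):
    """Select the component services a node could run for the given meeting mode."""
    selected = {name for name, needs in SERVICE_REQUIREMENTS.items()
                if needs <= capabilities}
    if "native_multicast" not in capabilities:
        # no native multicast: add the connector that provides overlay connectivity
        selected.add("multicast_connector")
    if meeting_mode == "audio_only":
        selected -= {"networked_display", "pointing", "hand_motion_tracking"}
    return sorted(selected)

# Example: a consultation node with a camera, tiled display, and laser pointer
print(compose_services({"camera", "display", "tiled_display", "laser_pointer"},
                       "consultation"))
# -> ['media_consumer', 'media_producer', 'multicast_connector',
#     'networked_display', 'pointing']
```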
Fig. 4: SMeet software and hardware architecture.

Each component service has elementary and unique function(s) to assist the construction of a collaboration node. In order to undertake a certain function that adds value to the advanced collaboration, a service can be realized by offering access to resources such as devices and software programs (e.g., audio and video tools). In our approach, the services are categorized into functional sets, namely media and data, networking, display, and interaction, as depicted in Fig. 4. These services are developed based on distributed component technology. Thus, each collaboration node can flexibly access and control all instances of component services over the networks, although the actual access/control may be restricted by the corresponding privileges. Note that, by extending this, we can later compose new services that perform complex tasks with the SMeet Mediator. A component service includes information that specifies its service properties, the so-called service capability: role, category, and interface. The role property indicates the role of a given service (e.g., producer or consumer). The category property presents the functionality category of a given service among the media and data, networking, display, and interaction categories. Lastly, the interface property defines a call method to access the component service. For example, the video producer service provides media-related producer functionality utilizing a video capture device and combined computing/networking capabilities; it can be invoked by specifying parameters such as the video format, transport protocol, multicast address, and optional configuration elements. In the following, the component services considered in our approach are briefly described.

1) Media and data service category: The media and data service category provides real-time media transmission for seamless audio and video communication between participants. A pair of media {producer, consumer} services performs the elementary functions that support interactive video conferencing. The media producer service grabs video, encodes it, and delivers the streamed media toward other nodes. The media consumer service receives the streamed media sent by the media producer service, decodes it, and relays the rendered video to the matching display device. The media arbitration service covers several trouble-shooting jobs that improve the QoE (quality of experience) of live media services. It handles multicast address resolution and capability negotiation (via the media tool connector) between multiple media producers and consumers. It also covers dynamic media adaptation among the selected producers and consumers to cope with dynamic variations of the underlying networks and participating systems.

2) Networking service category: The networking service category monitors network performance and removes networking barriers such as multicast connectivity problems and NAT/firewall traversal. Basically, the media {producer, consumer} services transmit streamed media via multicast-based group communication. If multicast is not available, the multicast connector service helps them continue the media services by providing multicast connectivity through hybrid (native and overlay) multicast techniques.
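As a rough sketch of the networking support described above (our own illustration, not SMeet code), the snippet below joins a multicast group for incoming media and, if the group cannot be joined, registers with a hypothetical unicast relay acting as an overlay connector.

```python
# Illustrative sketch: receive streamed media via native multicast when possible,
# otherwise fall back to a unicast overlay relay (group and relay address are hypothetical).
import socket
import struct

MCAST_GROUP, MCAST_PORT = "239.1.2.3", 5004   # example multicast session
RELAY_ADDR = ("10.0.0.1", 5004)               # hypothetical overlay relay endpoint

def open_media_socket():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    try:
        # ask the kernel to join the multicast group on the default interface
        mreq = struct.pack("4s4s", socket.inet_aton(MCAST_GROUP),
                           socket.inet_aton("0.0.0.0"))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        return sock, "native multicast"
    except OSError:
        # no native multicast: register with the overlay relay instead
        sock.sendto(b"JOIN " + MCAST_GROUP.encode(), RELAY_ADDR)
        return sock, "overlay relay"

if __name__ == "__main__":
    sock, mode = open_media_socket()
    print("receiving media via", mode)
    # sock.recvfrom(...) would then read the RTP-sized media datagrams
```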
3) Display service category: In the display service category, the interactive display control service communicates with the multimodal interaction service in order to control the display device based on user interaction (e.g., pointing). It also enables users to place and resize visual data on any part of the display. The networked display service covers the function of presenting the decoded video or rendered graphics on the tiled display via a network interface (instead of display connectors such as DVI). Note that a visualization can span several tiles of the display, which means that the rendered visual data is split and separately transmitted to the matching networked displays (a minimal sketch of this splitting is given at the end of this subsection).

4) Multimodal interaction service category: In the interaction service category, we currently focus on user-friendly interaction with the tiled display. The pointing service enables a user to point at a certain spot on the tiled display and to present that spot on remote tiled displays. It also allows a user to manipulate the tiled display (e.g., moving a video). The hand-motion tracking service supports the shared manipulation of 3D graphic objects with remote nodes (e.g., moving and turning objects following the user's hand motion). The location tracking service tracks user positions based on wireless localization technology.
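As referenced above for the display service category, the following sketch is our own illustration (tile geometry and the per-tile send routine are hypothetical placeholders) of splitting a rendered frame into sub-images for a 6-by-4 networked tiled display.

```python
# Illustrative sketch: split a rendered frame across the tiles of a networked
# tiled display; the transmission step is a placeholder.
import numpy as np

TILE_COLS, TILE_ROWS = 6, 4                      # e.g., the 6-by-4 LCD array of the prototype
WALL_W, WALL_H = 9600, 6400                      # total wall resolution reported for the prototype
TILE_W, TILE_H = WALL_W // TILE_COLS, WALL_H // TILE_ROWS

def split_frame(frame: np.ndarray):
    """Yield (row, col, sub_image) for each tile of the display wall."""
    for row in range(TILE_ROWS):
        for col in range(TILE_COLS):
            y0, x0 = row * TILE_H, col * TILE_W
            yield row, col, frame[y0:y0 + TILE_H, x0:x0 + TILE_W]

def send_to_tile(row, col, sub_image):
    # placeholder for the per-tile network transmission (e.g., a UDP/TCP stream)
    print(f"tile ({row},{col}): {sub_image.shape[1]}x{sub_image.shape[0]} pixels")

frame = np.zeros((WALL_H, WALL_W, 3), dtype=np.uint8)   # stand-in for a rendered frame
for row, col, sub in split_frame(frame):
    send_to_tile(row, col, sub)
```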
IV. SMEET PROTOTYPE: KEY FEATURES AND OPERATION

A. Prototype Realization and Key Features
The main features of the SMeet prototype system are a tiled display service that efficiently visualizes high-resolution media data, a pointing service that lets users control the visualized media at a near distance, and a hand-motion tracking service that helps users operate three-dimensional medical contents.

Table 3: SMeet prototype specification.
Media devices: Two HD and four PTZ cameras; microphones, speakers, and an echo canceller
Tiled display: 22-inch LCDs in a 6-by-4 array supporting 9600 x 6400 resolution; one workstation for display control and 12 workstations for the display service; two workstations for display interaction and GUI operation
Display pointer: Pointing errors under 40 pixels; IR laser pointer and camera (VGA, 29 fps) with an IR optical filter
Hand-motion tracker: Response time under 0.5 sec; two virtual gloves with multiple acceleration and gyro sensors
Localization sensors: Accuracy better than 0.8 m; a localization server and six ultrasound- and RF-signal-based localization sensor nodes
Networking: 10 Gbps multicast-enabled network

The tiled display service realizes an ultra-high-resolution display by integrating a set of network-capable LCD displays in an interactive collaboration environment. It freely presents visualization contents through given text commands or GUI operations (e.g., showing, hiding, resizing, and moving). The tiled display service receives the raw visualization contents from selected machines supporting media conversion capabilities such as rendering and decompressing streaming media. Following the given display commands and operations, the tiled display service presents the raw contents on the appropriate display where people want to show them. Two or more display systems (i.e., physically separate display devices) can also interwork to present larger-scale visualization contents over the integrated tiled displays. Display interaction for multiple users means that several users can conveniently operate the visualization contents presented on the tiled displays. This approach enables users to manipulate the contents with multi-modal interaction using user-friendly devices that recognize user actions (e.g., touching, pointing, and gestures) as well as with conventional keyboard or mouse interaction. When a presenter points at a specific spot on the networked display using the laser pointer, the pointed spot can be presented on the networked displays of remote environments. Listeners in remote environments can then understand what the presenter highlights by watching the pointed spot. In addition, when a user makes gestures carrying a prearranged meaning for collaboration with the pointer, the visual objects on the networked display are operated in the corresponding manner. In a similar way, users can control three-dimensional graphic models represented on the tiled display by using hand-motion trackers.

B. SMeet Operation for Medical Collaboration
In this section, we introduce a sample service scenario that can be provided by the SMeet prototype. A patient who has been treated at a hospital in Seoul (Seoul hospital) goes on a business trip to Jeju. To provide continuous medical service, the doctor at Seoul hospital (Dr. Seoul) finds the hospital (Jeju hospital) nearest to the patient's location in Jeju and asks a doctor at Jeju hospital (Dr. Jeju) to treat the patient. Before Dr. Jeju meets the patient, Dr. Jeju should get information about the patient from Dr. Seoul. At this point, the SMeet prototype serves as an information exchange interface between Dr. Seoul and Dr. Jeju. The SMeet prototype connects the doctors, visualizes large medical data on an ultra-high-resolution display, supports intuitive interaction with the visualization results, and shares the interaction results.

Fig. 5: Tiled display service.

Fig. 6: Realization of tiled display service.

Figure 5 shows the tiled display service provided by the SMeet prototype. Dr. Seoul and Dr. Jeju can visualize the X-ray image of the patient on their tiled displays. Dr. Jeju can see the details of the X-ray image by moving and resizing the image. Figure 6 shows an implementation of the tiled display service. The pointing service provides display interactions such as moving and resizing the visualized media on the tiled display. Dr. Seoul can select media on the tiled display using a laser pointer and can resize and move the selected media, as depicted in Fig. 7. These display interactions of Dr. Seoul are shared with Dr. Jeju to enhance mutual understanding. Figure 8 shows the realization of the pointing service.
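To make the shared display interaction concrete, the sketch below is our own illustration (the message format and endpoints are hypothetical, not the SMeet protocol) of how a pointer-driven move/resize operation might be encoded and forwarded to both the local networked display and the remote collaboration node, so that Dr. Jeju sees the same change.

```python
# Illustrative sketch: encode a pointer-driven move/resize of a visual object
# and share it with local and remote display services (endpoints are hypothetical).
import json
import socket

LOCAL_DISPLAY = ("127.0.0.1", 7000)   # local networked display service (hypothetical port)
REMOTE_NODE = ("10.0.0.2", 7000)      # remote collaboration node (hypothetical address)

def send_display_command(obj_id, x, y, width, height):
    """Send one move/resize command to both the local and the remote display."""
    command = json.dumps({
        "type": "move_resize",
        "object": obj_id,             # e.g., the shared X-ray image
        "x": x, "y": y,               # new top-left position on the display wall (pixels)
        "width": width, "height": height,
    }).encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for endpoint in (LOCAL_DISPLAY, REMOTE_NODE):
        sock.sendto(command, endpoint)

# Example: Dr. Seoul enlarges the selected X-ray image with the pointer
send_display_command("xray_patient_001", x=3200, y=1600, width=4800, height=3200)
```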
Fig. 7: uT-pointer pointing service.

Fig. 8: Realization of the uT-pointer pointing service.

The hand-motion tracking service helps users operate three-dimensional medical contents. As shown in Fig. 9, Dr. Seoul can show various aspects of a three-dimensional medical content to Dr. Jeju with the hand-motion tracking service, which enables detailed discussion of the medical content between the doctors. Figure 10 depicts the realization of the hand-motion tracking service: a 3D skull image is manipulated on the tiled display and the manipulation is shared with remote participants.

Fig. 9: Hand-motion tracking service (Cyber Glove).

Fig. 10: Realization of hand-motion tracking service.

V. CONCLUSION
To provide interactive tele-medical consultation, a next-generation collaboration environment is designed employing the CMC infrastructure. For the target medical service scenarios, we demonstrate that the SMeet (smart meeting space) collaboration environment can be adapted to support the required collaboration demands of medical doctors. Also, proof-of-concept verifications are successfully made to justify the potential of the SMeet prototype in realizing the CMC vision. However, it should be noted that the current prototype is still at a preliminary development stage. For example, it still lacks the integration of high-performance visualization driven directly by interactive medical simulations. The integration of real-time interactive browsing of relevant information (stored in databases) also belongs to the remaining tasks.

REFERENCES
[1] R. Stevens, M. E. Papka, and T. Disz, "Prototyping the workspaces of the future," IEEE Internet Computing, pp. 51-58, 2003.
[2] M. H. Söderholm, D. H. Sonnenwald, B. Cairns, J. E. Manning, G. F. Welch, and H. Fuchs, "The potential impact of 3D telepresence technology on task performance in emergency trauma care," in Proc. ACM Conf. on Supporting Group Work, New York, NY, pp. 79-88, 2007.
[3] A. Geissbuhler, C. O. Bagayoko, and O. Ly, "The RAFT network: 5 years of distance continuing medical education and tele-consultations over the Internet in French-speaking Africa," International Journal of Medical Informatics, vol. 76, no. 5-6, pp. 351-356, May-June 2007.
[4] Office of Telemedicine, University of Virginia. Available: http://www.healthsystem.virginia.edu/Internet/telemedicine/.
[5] Eastern Montana Telemedicine Network. Available: http://www.emtn.org/.
[6] L. Watts and A. Monk, "Telemedical consultation: Task characteristics," in Proc. of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, pp. 534-535, 1997.
[7] J. E. Cabral and Y. Kim, "Multimedia systems for telemedicine and their communications requirements," IEEE Communications Magazine, vol. 34, no. 7, July 1996.
[8] S. Han, N. Kim, K. Choi, and J. Kim, "Design of multi-party meeting system for interactive collaboration," in Proc. IEEE Int. Conf. on Communication System Software and Middleware, Bangalore, India, Jan. 2007.
[9] M. C. Roco and W. S. Bainbridge, "Converging technologies for improving human performance: Nanotechnology, biotechnology, information technology and cognitive science," NSF/DOC-sponsored report, Jun. 2002.
[10] S.-K. Ng, "Smart bio-laboratories of the future," in Proc. of the IEEE International Symposium on Circuits and Systems, May 2005.
[11] I. Maglogiannis, K. Delakouridis, and L. Kazatzopoulos, "Enabling collaborative medical diagnosis over the Internet via peer to peer distribution of electronic health records," J. Med. Syst., vol. 30, no. 2, pp. 107-116, 2006.