Journal of Critical Care (2010) xx, xxx–xxx
Toward optimal display of physiologic status in critical care: I. Recreating bedside displays from archived physiologic data☆,☆☆
Anton Burykin PhD a,b,⁎, Tyler Peck BA b, Vladimir Krejci MD d, Andrea Vannucci MD c, Ivan Kangrga MD, PhD c, Timothy G. Buchman PhD, MD a,b
a Emory Center for Critical Care (ECCC) and Department of Surgery, School of Medicine, Emory University, Atlanta, GA 30322, USA
b Department of Surgery, School of Medicine, Washington University in St Louis, St Louis, MO 63110, USA
c Department of Anesthesiology, School of Medicine, Washington University in St Louis, St Louis, MO 63110, USA
d Department of Anesthesiology, University Hospital of Bern, CH-3010 Bern, Switzerland
Keywords: Data display; Dynamic visualization; Scientific visualization; Patient monitoring; Visualization of physiologic signals; Medical education
Abstract
Background: Physiologic data display is essential to decision making in critical care. Current displays echo the first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or with the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment, so tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step.
Objectives: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such "extended SSSI displays" from raw data.
Results: We present the Multi Wave Animator (MWA) framework, a set of open source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts designed to create dynamic visualizations (eg, video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed so as to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
© 2010 Elsevier Inc. All rights reserved.
Abbreviations: GUI, graphical user interface; HRV, heart rate variability; ICU, intensive care unit; MODS, multiple organ dysfunction syndrome; MWA, Multi Wave Animator; OR, operating room; SICU, surgical intensive care unit; SSSI, single sensor single indicator
☆ Financial support: This work was generously supported by grants from the James S. McDonnell Foundation (220020070) and the Defense Advanced Research Projects Agency (DARPA) (49533-LS-DRP and HR0011-05-1-0057). ☆☆ Conflict of interest: none declared. ⁎ Corresponding author. Department of Surgery, Emory University, Atlanta, GA 30322, USA. Tel.: +1 314 761 5422. E-mail addresses:
[email protected]; www.burykin.com (A. Burykin).
0883-9441/$ – see front matter © 2010 Elsevier Inc. All rights reserved. doi:10.1016/j.jcrc.2010.06.013
1. Introduction
Bedside presentation of physiologic data is central to modern critical care. Current-generation bedside monitors are designed according to the "single sensor single indicator" (SSSI) display paradigm [1]. That is, a single indicator is displayed separately for each individual sensor connected to the patient. Waveform displays (such as the electrocardiogram [ECG]) and simple time-averaged data (such as heart rate) have been the basis for clinical decision making for at least 4 decades. However, whether these displays provide the optimal data synthesis is still unknown. As computer storage has become less expensive, waveform data have begun to be archived in numerous databases such as Multiparameter Intelligent Monitoring in Intensive Care (MIMIC) [2], MIMIC-II [3] (most of these data are freely available on Physionet; see Moody et al [4] for details), IMPROVE (improving control of patient status in critical care) [5], IBIS (improved monitoring of brain function in intensive care and surgery) [6], and Complex Systems Laboratory (CSL) [7]. The number of patients and records in each varies from a few dozen to several thousand, and the length of records varies from a few hours to several days (for a review, see Korhonen et al [8]; for an approach using modern health information technology [IT] standards, see Eklund et al [9]). We, too, have collected and archived physiologic data for postprocessing and analysis (see Burykin and Buchman [10] and Lu et al [11] for details). Special computer programs are required to visually display these signals. Although many of these programs are freely available (eg, on Physionet [12]), significant effort may be required for a clinician to learn how to use them (because many work only under Unix-like environments). Moreover, the displays do not emulate bedside monitors. This shortcoming compromises the development of new displays and their comparison with current technologies.
The technology of recording and archiving physiologic signals has advanced faster than the technology of playing them back, which has attracted only limited attention (see, eg, Kreuzer et al [13] and Stockmanns et al [14]). To address this technology gap, we developed the Multi Wave Animator (MWA), a set of open source MATLAB scripts that allows one to create animations (eg, AVI video files) of recorded signals and thus to present clinically relevant information in a format familiar to clinicians. It can also be used for the construction and testing of novel, advanced types of physiologic signal display (before new display algorithms are hard-coded into actual bedside monitors). The paper is organized as follows. In the next section, we describe the displays of typical SSSI bedside monitors and provide an overview of the archiving procedure. In Section 3, we provide a detailed description of the MWA software as well as the animation creation process that MWA uses. In Section 4, we discuss several advanced features (such as heart rate variability [HRV] indices, indices
of cardiorespiratory synchronization, or vital sign sonification) that can be incorporated through MWA into advanced displays. We also outline current limitations and future development of the software. Finally, we provide information regarding MWA availability.
2. Display and recording of patient vital signs
2.1. Display of a typical bedside monitor
The patient data displayed by a bedside monitor belong to 2 distinct types: waveforms (such as the ECG), which are renewed "continuously," and numeric data (such as heart and respiration rates), which are renewed only, for example, once per second. Generally, there are 2 ways in which waveform dynamics can be displayed on the monitor, sometimes called a steady trend line and a moving trend line [15]. In the first case, a waveform starts at the left border of the screen and proceeds with time to the right border. Once the waveform has reached the right border, it returns to the left border and begins to overwrite previously displayed values with new ones. In the second case, a waveform starts at the right border and constantly proceeds to the left border, where it drops off the screen. Moreover, the timescales of the displayed waveforms are frequently not the same despite the "continuous sweep" across the screen: intervals of different lengths are displayed on the monitor screen for different types of waveforms (eg, hemodynamic and respiratory). Such subtleties, although typically unnoticed by clinicians, profoundly affect data display design.
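The two sweep modes amount to simple index arithmetic over the sample buffer. The following Python sketch (illustrative only; MWA itself is written in MATLAB, and the function and parameter names here are our own) selects the samples visible on screen at a given elapsed time under each mode:

```python
import numpy as np

def visible_window(signal, t, fs, screen_s, mode="moving"):
    """Samples visible on a monitor screen at elapsed time t (seconds).

    mode="moving": the newest screen_s seconds slide right-to-left.
    mode="steady": the trace sweeps left-to-right, overwriting the
    previous sweep up to the current cursor position.
    """
    n_screen = int(screen_s * fs)      # samples spanning the screen width
    i = int(t * fs)                    # index of the newest sample
    if mode == "moving":
        return signal[max(0, i - n_screen):i]
    # steady sweep: fresh samples on the left of the cursor, the
    # not-yet-overwritten remainder of the previous sweep on the right
    sweep_start = (i // n_screen) * n_screen
    k = i - sweep_start                # cursor position on screen
    frame = np.full(n_screen, np.nan)
    frame[:k] = signal[sweep_start:i]
    if sweep_start >= n_screen:
        frame[k:] = signal[sweep_start - n_screen + k:sweep_start]
    return frame
```

Rendering one or the other mode is then just a matter of redrawing this window at the frame rate.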
2.2. Data recording
Bedside data are typically displayed in real time. Archiving is another matter. Depending on the hardware and software combination used for vital sign recording, different sampling frequencies are observed in different databases. Waveforms are usually recorded at sampling frequencies between 62.5 and 500 Hz (within one data set, different sampling frequencies are used for different waveforms), and numeric data are typically recorded at sampling frequencies between 1 and 0.0167 Hz (from once per second to once per minute). Usually (but surprisingly not always), the sampling frequency is the same for all numeric data channels. In some cases, monitor alarms and alerts are also recorded. The challenge of recreating a data display is apparent. A typical problem that can occur during the animation of multichannel recordings also deserves mention. If too many signals are recorded simultaneously, gaps can appear in the recorded signals (both in waveforms and in numeric data) because of the fixed and limited transmission capacity of the monitor. Thus, the sampling frequencies cannot be assumed to be constant throughout the recording. Because these gaps occur randomly, at different times for different signals, the signals become unsynchronized (this desynchronization can
occur both between waveforms and numeric data and among different waveforms) if they are displayed together under the assumption of constant sampling frequencies. Depending on the particular archiving system, during an hour-long recording, gaps can cause a delay of about 10 seconds between waveforms and numeric data (eg, between the continuous arterial blood pressure [ABP] waveform and the corresponding systolic and diastolic numeric values). This artifact has substantial potential to confuse and confound and must therefore be corrected.
3. Multi Wave Animator framework description
The Multi Wave Animator has a modular structure and consists of 3 MATLAB scripts (signal reader, frame generator, and movie generator) that are applied sequentially. The choice of MATLAB (www.mathworks.com) reflects its emergence as a standard software environment for biomedical signal processing and visualization, available for all major operating systems (MS Windows [Microsoft, Redmond, WA, USA], Mac OS [Apple Inc., Cupertino, CA, USA], and Linux [Linux Kernel Organization, Inc., San Francisco, CA, USA]). Multi Wave Animator was developed and tested with MATLAB release 2007b running on Dell workstations (Dell Inc., Round Rock, TX, USA) under MS Windows XP OS (both 32- and 64-bit versions). It is expected to run under other supported platforms and newer (and possibly also earlier) versions of MATLAB. A flowchart of the process of movie creation with the MWA is shown in Fig. 1. The MWA components (signal reader, frame generator, and movie generator) are described below. A detailed step-by-step tutorial with an example run is available on the MWA Web page.
3.1. Signal reader
The input data for the signal reader are prepared in comma-separated values (CSV) text format. Although different databases and waveform archives use different internal (mostly binary and sometimes proprietary) formats for storage, utilities provided with commercial archiving systems allow the data to be exported into text format, so the signal reader can work with signals from any data library. Moreover, if the data are stored in a relational database, they can be imported directly into MATLAB (using the MATLAB Database Toolbox) without the need for intermediate text files. In this case, the signal reader code can be further simplified; because MWA has a modular structure, no other modules have to be altered.
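For illustration, reading one exported channel might look like the following Python sketch. The two-column layout assumed here (a header row followed by "time, value" rows) is hypothetical; a real archive export would dictate the actual parsing.

```python
import csv
import numpy as np

def read_channel(path):
    """Read one exported signal channel from a CSV file.

    Assumed (hypothetical) layout: a header row followed by
    "time_s,value" rows. Real archive exports vary, so this parsing
    would need to be adapted to the actual export format.
    """
    times, values = [], []
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)                     # skip the header row
        for row in rows:
            times.append(float(row[0]))
            values.append(float(row[1]))
    return np.array(times), np.array(values)
```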
Fig. 1 Conceptual framework and flowchart of the movie creation procedure using the MWA. The inset at the bottom shows the 3 modules of MWA: signal reader, frame generator, and movie generator (see Section 3 for details). The content of the box "advanced features" is discussed in Section 4.
To overcome the problem of different sampling frequencies and possible gaps in the data, all signals (both waveforms and numeric data) are interpolated (not resampled) by the signal reader at a common sampling frequency (we usually use 100 Hz). The user is required to input the value of the sampling frequency, select the fragment of the data set to be animated, and choose a range (minimum and maximum limits of the vertical axis) for each waveform. If values of the waveform exceed the limits, they are "clipped" (that is, replaced by the constant equal to the minimum or the maximum limit) so that they remain within range, just as happens on physical monitors. Alternatively, the user can choose "autoscaling" for some waveforms (or for all of them); in this case, the vertical axis limits are adjusted dynamically during the animation to the minimum and maximum of the displayed interval of the waveform. Finally, all signals, together with the entered parameters, are written by the signal reader to a single binary MATLAB file called signals.mat.
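The interpolate-then-clip step can be sketched as follows in Python (illustrative only; MWA performs this in MATLAB, and the function and parameter names here are ours):

```python
import numpy as np

def prepare_channel(t, x, fs_common=100.0, y_min=None, y_max=None):
    """Interpolate a channel onto a common time grid, then clip it.

    t, x        : recorded sample times (s) and values; gaps are allowed
    fs_common   : common sampling frequency shared by all channels (Hz)
    y_min/y_max : vertical-axis limits; leave either as None to autoscale
    """
    t_grid = np.arange(t[0], t[-1], 1.0 / fs_common)
    x_grid = np.interp(t_grid, t, x)   # linear interpolation bridges gaps
    if y_min is None or y_max is None: # autoscale to the displayed data
        y_min, y_max = x_grid.min(), x_grid.max()
    return t_grid, np.clip(x_grid, y_min, y_max)
```

Because every channel ends up on the same grid, a single frame index later suffices to advance all of them together.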
3.2. Frame generator
The frame generator script is responsible for the creation of individual movie frames. First, it reads the signals and previously defined parameters from the signals.mat file. Then the user specifies the exact location (relative coordinates) of every signal on the screen (frame), the time interval (s) to display for every waveform, and the frame rate. We found that a frame rate of 30 frames per second is appropriate for a visually pleasing animation1. The frame generator then creates individual frames one by one (in a loop) and writes them to the hard drive. The loop counter (defined by the frame rate) determines the relative shift of the waveforms between consecutive frames. Because all signals have the same sampling frequency, a single parameter (a time shift that is inversely proportional to the frame rate) determines the shift of all waveforms and the renewal of all numeric vital signs at every frame. A typical frame created by the frame generator is shown in Fig. E1a (online supplemental material). It is possible to define different time intervals for different displayed waveforms, as is common at many bedsides where the respiratory traces are "compressed" (see Fig. E1b); in this case, however, different time shifts must be used for different signals. All frames have an extra space at the bottom for subtitles (added during movie postprocessing) that can contain any explanatory information (eg, "stable hemodynamics"). This space can also be used to display bedside monitor alarm messages, if they were recorded together with the vital signs. All frames also have a "timer" (at the bottom right corner) that displays the time since the beginning of the movie (the timer is based on the sampling frequency). Every frame is saved to the hard drive as a file in MATLAB format (".dat") and additionally as a bitmap (".bmp") file (or any other graphical file format supported by MATLAB on a particular OS). This redundancy allows the user to visually monitor the movie creation process (because bmp files can be previewed, eg, with the built-in MS Windows Picture Viewer). Another reason individual frames may be needed as high-quality bmp files is that the user may want animation formats other than an AVI movie file (eg, animated GIF or SWF/Flash files). These files can be created directly from a sequence of bmp files and have relatively small sizes, so they can easily be included in a Web page or a PowerPoint presentation. The AVI movie file can also be created directly from the individual bmp images using most standard video-editing programs (see Section 3.4 below about postprocessing). This option (which can easily be disabled), however, has a significant drawback: the large size of a typical bmp file. For example, the bmp file of a single movie frame (Fig. E1) is about 3 MB, so a 10-second movie at 30 frames per second generates 300 bmp files with a total size of about 1 GB. Thus, if bmp files are generated, a relatively large scratch (temporary) disk is required. On the basis of our experience, we elected to save individual frames and to separate the process of frame generation from the process of movie generation for the following additional reasons.
1 Our goal was to "imitate" physiologic waveform display by a real bedside monitor. Thus, the movie is animated "in real time" (ie, it takes 1 minute to animate 1 minute of the recorded data), and our optimal frame rate (30 frames per second) corresponds to "real-time" animation. However, most movie players have a "fast forward" option that controls the playback speed, so it is possible to play a long movie faster, for example, to find a clinically interesting fragment. At a higher playback speed, a frame rate of 30 frames per second is actually higher than necessary. Our animations are optimized for "real-time" viewing; also, because our goal was to create animations playable by any standard movie player, it was not possible to implement a variable (speed-dependent) frame rate.
First, we found that unless the system is managed efficiently, MATLAB processing can slow down significantly: frame creation time can increase 5-fold after the first 100 frames. Thus, we chose to automatically terminate and restart the frame generation process every 50 frames. Second, at any given time, only one frame is stored in memory; thus, memory is not a limiting factor even for a very long movie (we ran the MWA scripts on a desktop computer with 4 GB of RAM; the maximum amount of RAM occupied during a run was about 150 MB). Also, if MATLAB or the OS crashes in the middle of a long movie creation, no results (frames) are lost, and the program can be restarted from the frame at which it crashed. Finally, with multiple processors increasingly common in desktop computing, a long data set can be split into multiple fragments, and multiple instances of the frame generator can run in parallel. This can be done using either several computers or a single computer with multiple processors (or cores) and multiple monitors, so that every MATLAB instance runs on its own processor (or core) and frames for each movie fragment are displayed on their
own monitor. Thus, frame generation time scales roughly inversely with the number of computers used. Parallel runs of the frame generator can significantly speed up frame generation, which is the most time-consuming part of animation creation.
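The single-shift bookkeeping that drives the frame loop can be sketched in a few lines (Python here for illustration; MWA is MATLAB, and the function name is ours):

```python
def frame_sample_indices(n_frames, fs=100.0, frame_rate=30.0):
    """Index of the newest sample shown in each frame.

    Real-time playback advances every waveform and numeric value by
    the same shift of 1/frame_rate seconds (fs/frame_rate samples)
    per frame, which is what keeps all channels in lock step.
    """
    shift = fs / frame_rate            # samples advanced per frame
    return [round(k * shift) for k in range(n_frames)]
```

A parallel run simply partitions this index list into contiguous fragments, one per MATLAB instance.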
3.3. Movie generator Movie generator is a short and very simple MATLAB script that sequentially loads binary files (“.dat”) for individual frames and creates a movie file in AVI format. It uses frame rate and the codec name as input parameters.
3.4. Postprocessing
Once created, the movie can be edited using common video-editing software, either free (eg, MS Movie Maker or VirtualDub under Windows, iMovie under Mac OS) or commercial (eg, Adobe Premiere). Editing can include adding annotations (extra frames with a movie title, interrogatory and/or explanatory text, and subtitles) and audio (eg, narration). The movie can also be mixed with other video fragments, for example, a short video lecture by a clinician who presents the case, or with a video recording of the surgical procedure if the vital signs were recorded during an operation that was simultaneously videotaped (see, eg, Kanani et al [16]).
4. Advanced features
The Multi Wave Animator can be used to create interactive2 dynamic displays that simply replicate the bedside monitor display. However, the modular structure of MWA (Fig. 1) also facilitates construction of virtual displays that include advanced features (some of which go beyond the current SSSI paradigm). These features are discussed below.
4.1. Vital sign variability
Movies of virtual displays can include new signals (in both "numeric" and "waveform" formats) that are not displayed by conventional bedside monitors but are derived from the "raw" signals (recorded vital signs). These "extra" signals may include, for example, indices of HRV [18] or variability ("complexity") indices of other vital signs (free software for variability analysis can be found online, eg, on Physionet). In this way, movies of conventional and novel virtual displays can be created and played back in parallel to assess the utility of a presentation before customized software and hardware is ever designed.
2 Animation video is an interactive type of display because the user can stop, replay, reverse, or change speed and can view and review different fragments of the movie in any sequence (for a formal discussion from the educational psychology and cognitive science point of view, see, for example, Hegarty [17] and references therein). We acknowledge that this is still very limited interactivity (compared with what can be achieved with hypervideo or virtual reality technologies), but it is the maximum level of interactivity achievable with a simple movie format (eg, AVI). Such a simple format is required because our goal was to create a movie that can be played by anyone using a regular desktop or laptop computer and virtually any media player, without the need for any special software or hardware.
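As one concrete example of such a derived "extra" signal, two standard time-domain HRV indices can be computed from RR (interbeat) intervals. The Python sketch below is illustrative only; MWA itself simply animates whatever precomputed channel it is given.

```python
import numpy as np

def hrv_indices(rr_ms):
    """SDNN and RMSSD, two standard time-domain HRV indices.

    rr_ms: sequence of RR (interbeat) intervals in milliseconds.
    SDNN reflects overall variability; RMSSD reflects short-term,
    beat-to-beat variability.
    """
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                       # sample SD of intervals
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # RMS of successive diffs
    return sdnn, rmssd
```

Computed over a sliding window, either index yields a slowly varying "numeric" channel that MWA can display alongside the raw waveforms.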
4.2. Organ-organ interconnection
In a similar way, MWA can also display indices that measure interconnections between organ systems and thus overcome a conceptual limitation of traditional SSSI-type monitors. Because every waveform represents the "state" or "dynamics" of a single organ (or organ system), no signals displayed on a conventional bedside monitor represent the degree of organ-organ or multiorgan connectivity (eg, indices of cardiorespiratory synchronization [19]) in real time. The importance of monitoring organ-organ interactions in the intensive care unit (ICU), especially for patients with multiple organ dysfunction syndrome, is discussed in [20]. This is also relevant to the animation of recorded vital signs because, for example, the IBIS database contains vital signs of multiple organ dysfunction syndrome patients. Indices of cardiorespiratory coupling calculated from vital signs recorded from an ICU monitor are discussed in Burykin and Buchman [10]. Display of these coupling indices in addition to the vital signs can be viewed as a "compromise" between traditional SSSI-type displays and recently introduced advanced integrated (graphical or ecological) displays [21-23] (such monitors do not display waveforms but graphically represent organ systems as, eg, 2-dimensional or 3-dimensional objects and organ dynamics as changes in object size or shape). In our case, some interrelationships among different vital signs can be displayed, whereas all vital signs are still displayed in the traditional, familiar "waveform and numeric data" format. With this extra capacity implemented in animations, the dynamics of the vital signs and of their variability and coupling indices can be displayed and viewed at the same time on the same screen.
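One simple index of this kind is the mean phase coherence of the n:m phase difference between two signals (eg, cardiac and respiratory). The Python sketch below is illustrative, not MWA code, and assumes the instantaneous phases have already been extracted (eg, via the Hilbert transform):

```python
import numpy as np

def phase_locking_index(phi1, phi2, n=1, m=1):
    """Mean phase coherence of the n:m phase difference.

    phi1, phi2: instantaneous phases (radians) of the two signals.
    Returns a value in [0, 1]: near 0 for independent phases,
    1 for perfect n:m phase locking.
    """
    dphi = n * np.asarray(phi1) - m * np.asarray(phi2)
    # magnitude of the average unit phasor of the phase difference
    return abs(np.mean(np.exp(1j * dphi)))
```

Evaluated over sliding windows, this produces a coupling channel that can be animated next to the waveforms it is derived from.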
4.3. Mathematical models
Along with animations of the results of purely data-driven analysis of the vital signs (variability indices), MWA can also create movies with additional data channels that are the numerical solution (output) of a mathematical model that uses recorded vital signs as an input (see, eg, Kennedy et al [24] and Zenker et al [25]). For a movie observer, such a solution will be displayed effectively "in real time" together with the original waveforms (although the numerical simulations are performed off-line because they usually require a significant amount of computer time and power).
4.4. Sonification
Another opportunity to enhance MWA animations is to supplement and expand the visual display with an audio signal (sonification, or auditory display, of the vital signs [26]). Audio representation of the pulse oximetry signal with a variable tone is widely used in clinical practice in addition to its visual display on operating room (OR) bedside monitors [27]. It has been suggested that other vital signs could be sonified as well. Several experimental auditory displays that combine a traditional visual bedside monitor with sonification of multiple vital signs have been designed and successfully tested in laboratory environments (see Sanderson et al [21] for a review and Loeb and Fitch [28] and Sanderson et al [29] for details and for downloadable movie files with examples of simultaneous visualization and sonification of simulated waveforms). Sonification has been applied to the study of single-channel (HRV [30]) and multichannel (several electroencephalogram [EEG] channels [31] and EEG, electrooculography [EOG], and ECG [32]) waveform dynamics. Simulation studies have shown that sonification can be used to detect (possibly time-dependent) coupling and synchronization in weakly coupled oscillator models [33,34]. Thus, sonification may also be used to detect organ-organ interactions and, if included in the movie together with the animation, may help to integrate the traditional SSSI-type representation of vital signs into a holistic picture of the physiologic state (or dynamics) of the organism.
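The variable-pitch principle can be illustrated by mapping a numeric vital sign to tone frequency. The Python sketch below is an illustration only; the value and frequency ranges are arbitrary choices of ours, not parameters from the cited literature.

```python
import numpy as np

def sonify(values, v_range=(80.0, 100.0), f_range=(220.0, 880.0),
           fs_audio=8000, tone_s=0.25):
    """Map each numeric vital-sign value to a short pure tone.

    Pitch rises linearly with the value across v_range, echoing the
    variable-pitch pulse-oximetry beep. Returns one concatenated
    audio waveform sampled at fs_audio.
    """
    (v0, v1), (f0, f1) = v_range, f_range
    t = np.arange(int(fs_audio * tone_s)) / fs_audio
    tones = []
    for v in values:
        frac = (np.clip(v, v0, v1) - v0) / (v1 - v0)  # 0..1 position
        tones.append(np.sin(2 * np.pi * (f0 + frac * (f1 - f0)) * t))
    return np.concatenate(tones)
```

The resulting waveform could be written to a wav file and mixed into the movie as an extra audio channel during postprocessing.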
4.5. Alarm sounds
Another class of audio signals that can be added to the animation is alarm sounds. Although many archiving software packages (eg, BedMasterEx, www.excel-medical.com) can capture alarm messages and many publicly available databases (eg, MIMIC-II) contain the times of alarm events and messages (for specific alarm-focused data collections, see, eg, Zhang et al [35] and Siebig et al [36]), none currently records or stores the actual alarm sounds. Recently, however, an alarm sound database and simulator software became freely available [37]. Alarm sounds can be taken from this database and added to the video animation (during postprocessing) at the indicated times. It is also possible to develop new alarms by applying recently proposed novel algorithms (see, eg, Zong et al [38] and Clifford et al [39]) to the recorded vital signs and then present these new sounds along with the vital sign animation for testing and comparison. In summary, using the advanced features described above, one can create a "virtual (or augmented) bedside monitor." Such displays may enhance our understanding of physiologic dynamics while keeping the format of the display as close as possible to the real bedside monitor. We emphasize that MWA was designed so that none of these advanced features requires any modification of its code or of the movie creation procedure (see Fig. 1). That is, as long as an additional signal is prepared in
the same format as the "raw" data sets (see the signal reader section), it will be displayed and animated by MWA as just one more waveform (or numeric signal). Vital sign sonification must be done separately (software for signal sonification is freely available online; see, eg, Olivan et al [32]), and the resulting audio files (wav or mp3) can be added to the movie during the postprocessing stage using any movie-editing software (see above) as an extra audio channel.
5. Possible applications Animations made with the MWA can be used as an “offline emulator” of a bedside monitor for teaching, research, and quality control purposes:
5.1. Medical education
Vital signs that correspond to clinically interesting events, when they occur in the OR or ICU, can be recorded and then animated. The movies may be used in medical education to demonstrate vital sign dynamics, for example, at the onset of a particular complication or in response to a particular intervention3. Such movies can be made to serve multiple educational goals. For example, 2 versions of the same movie can be created: one fully commented and annotated and another containing only the "raw" vital sign animation. The first can be used for teaching and the second for testing medical students and residents.
5.2. Clinical research
Vital sign animations (especially those created with the use of advanced features) can be used to study the system-level physiologic dynamics of the organism under different conditions and under the influence of various clinical procedures. Also, MWA can create videos of purely synthetic (simulated) signals using a display identical to that of a typical bedside monitor. This may be used to present the results of clinically relevant mathematical models (see, eg, Kennedy et al [24] and Tham and Sasse [40]) in a format familiar to clinicians. A familiar representation may help clinicians to better assess and evaluate the behavior of such models. Finally, MWA can also be used to develop and test new audiovisual formats (representations) of vital signs (new monitoring frameworks and display types). For example, it is important to test whether sonification (presented via an auditory display) can characterize organ-organ interactions (eg, synchronization) better than synchronization indices displayed visually. Videos created with MWA may eventually help us to understand how to monitor and display the dynamics of a complex system (the critically ill patient) and its responses to external perturbations (eg, a surgical procedure).
3 Currently used simulators (training manikins) have several important limitations. First, they use artificially generated signals (ie, ECG, arterial blood pressure [ABP]). Second, they always assume the same physiologic patterns, for example, the ABP goes down and therefore the heart rate goes up. It is known that this is not always true and that reality is much more complex. Thus, having real physiologic waveforms and a sequence of real events would be an important additional training tool.
5.3. Quality control and patient safety
Audiotaping and videotaping are frequently used during surgical procedures (see, eg, Kanani et al [16]) and as tools for clinical team performance evaluation (for a review, see, eg, Jeffcott et al [41]). It is sometimes possible to simultaneously record patient vital signs. We believe that retrospective analysis of vital sign dynamics, displayed in a format that mimics a real bedside monitor and synchronized with the audio-video stream, may enhance our understanding of clinical team actions. Eventually, vital sign animations may become part of the electronic patient record in a hospital data system.
6. Comparisons with existing tools
Most currently available solutions have significant limitations. The EEG player [13] is a hardware-based solution that requires actual EEG monitors to display the signals; it also works only with EEG (waveform and derived numeric) vital signs. Arbiter [42] is only a front end to anesthesia simulators (BODY [Advanced Simulation Corporation, Point Roberts, WA, USA] and METI ECS [Medical Education Technologies, Inc., Sarasota, FL, USA]). To the best of our knowledge, NeuMonD [14] is probably the only software that implements dynamic visualization of both raw vital signs and an extensible set of advanced features (such as complexity or variability indices), although at present it focuses mainly on the display of brain dynamics.
7. Limitations of the MWA framework and future development
The Multi Wave Animator is still under development. In its current iteration, MWA is a working prototype rather than a user-friendly software product. Initially, MWA was developed as an internal tool for our group to be used in batch mode, so no graphical user interface was required or built. Thus, any modifications to the current version must be made directly in the program code, so some elementary knowledge of the MATLAB programming language is required (we discuss the details of our implementation in the appendix). This also means that MATLAB (which is commercial software) is required to modify and run the scripts. However, only the MATLAB base package is needed, and no additional toolboxes are required. Moreover, although the creation of an animation with MWA definitely requires some level of computer proficiency, once created, the movie can be played by anyone using a regular desktop or laptop computer and virtually any media player.
Future work includes development of a graphical user interface for fully visual creation of animations that will eliminate the need to manually edit MATLAB scripts. Users will be able to visually select the data channels they want to include in the animation and also visually define the locations of the waveforms and numeric data on the screen (a user-configurable display). Also, currently only the "moving trend line" waveform motion is implemented; a future version will include the "steady trend line" option as well.
8. Availability
Our discussions with other research groups that work with vital sign data make it apparent that there is a need for software with this functionality. Thus, we decided to make the current version of MWA available for further use and development. Moreover, to stimulate its future development, we decided to make MWA an open source project. The complete set of MATLAB scripts is freely available for download from the following URL: www.burykin.com/mwa/. It will also be uploaded to Physionet (www.physionet.org), as well as the MATLAB Central File Exchange (www.mathworks.com/matlabcentral/fileexchange/). The ZIP file contains the full set of MATLAB scripts, as well as a detailed tutorial that describes step by step how to use the scripts to create an animation. It also contains sample input ".csv" data files (a 20-second fragment of vital signs recorded from an OR bedside monitor), all the intermediate results (the signals.mat file and ".dat" and ".bmp" files with several individual frames), and the final ".avi" movie file, so the user can verify the results of every step of the tutorial. The scripts are extensively annotated and commented, so it is straightforward for users to modify them and create new animations of their own data.
9. Conclusions
We described current physiologic signal displays within the SSSI paradigm, considered possible extensions and enhancements of that paradigm, and developed a software framework to reconstruct current and future generation displays from raw data. The MWA framework remains a working prototype (a proof of principle); however, it can readily be used (alone or together with the other software tools mentioned in this paper) to construct and experiment with different types of physiologic signal display within the extended SSSI paradigm. This paper reports the first phase of a larger project that focuses on optimizing the display of physiologic status in critical care. The present report focused on conventional displays of physiologic signals and their possible extensions within the traditional SSSI paradigm. The next report will go beyond the SSSI paradigm and deal with alternative data displays (such as displays based on the complex systems paradigm).
Acknowledgments

The authors would like to thank Drs Madalena D. Costa, Phyllis L. Stein, and Eizo Watanabe for useful discussions.
Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.jcrc.2010.06.013.
References

[1] Goodstein LP. Discriminative display support for process operators. In: Rasmussen J, Rouse WB, editors. Human detection and diagnosis of system failure. New York: Plenum; 1981. p. 433-49.
[2] Moody GB, Mark RG. A database to support development and evaluation of intelligent intensive care monitoring. Comput Cardiol 1996:657-60.
[3] Saeed M, Lieu C, Raber G, Mark RG. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring. Comput Cardiol 2002:641-4.
[4] Moody GB, Mark RG, Goldberger AL. PhysioNet: a Web-based resource for the study of physiologic signals. IEEE Eng Med Biol Mag 2001;20(3):70-5.
[5] Nieminen K, Langford RM, Morgan CJ, Takala J, Kari A. A clinical description of the IMPROVE data library. IEEE Eng Med Biol Mag 1997;16(6):21-4, 40.
[6] Thomsen CE, Cluitmans L, Lipping T. Exploring the IBIS data library contents: tools for data visualisation, (pre-)processing and screening. Comput Methods Programs Biomed 2000;63(3):187-201.
[7] Goldstein B, McNames J, McDonald BA, Ellenby M, Lai S, Sun Z, et al. Physiologic data acquisition system and database for the study of disease dynamics in the intensive care unit. Crit Care Med 2003;31(2):433-41.
[8] Korhonen I, van Gils M, Gade J. The challenges in creating critical care databases. IEEE Eng Med Biol Mag 2001;20(3):58-62.
[9] Eklund JM, McGregor C, Smith KP. A method for physiological data transmission and archiving to support the service of critical care using DICOM and HL7. Conf Proc IEEE Eng Med Biol Soc 2008;2008:1486-9.
[10] Burykin A, Buchman TG. Cardiorespiratory dynamics during transitions between mechanical and spontaneous ventilation in intensive care. Complexity 2008;13(6):40-59.
[11] Lu Y, Burykin A, Deem MW, Buchman TG. Predicting clinical physiology: a Markov chain model of heart rate recovery after spontaneous breathing trials in mechanically ventilated patients. J Crit Care 2009;24(3):347-61.
[12] Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PC, Mark RG, et al. PhysioBank, PhysioToolkit, and PhysioNet: components of a new research resource for complex physiologic signals. Circulation 2000;101(23):E215-20.
[13] Kreuzer M, Kochs EF, Pilge S, Stockmanns G, Schneider G. Construction of the electroencephalogram player: a device to present electroencephalogram data to electroencephalogram-based anesthesia monitors. Anesth Analg 2007;104(1):135-9.
[14] Stockmanns G, Ningler M, Omerovic A, Kochs EF, Schneider G. NeuMonD: a tool for the development of new indicators of anaesthetic effect. Biomed Tech (Berl) 2007;52(1):96-101.
[15] Steimann F. Diagnostic monitoring of clinical time series [doctoral dissertation]. Vienna, Austria: Technical University of Vienna; 1995.
[16] Kanani M, Kocyildirim E, Cohen G, Bentham K, Elliott MJ. Method and value of digital recording of operations for congenital heart disease. Ann Thorac Surg 2004;78(6):2146-9.
[17] Hegarty M. Dynamic visualizations and learning: getting to the difficult questions. Learn Instr 2004;14(3):343-51.
[18] Camm AJ, Malik M, Bigger JT, Breithardt G, Cerutti S, Cohen RJ, et al. Heart rate variability—standards of measurement, physiological interpretation, and clinical use. Circulation 1996;93(5):1043-65.
[19] Schafer C, Rosenblum MG, Kurths J, Abel HH. Heartbeat synchronized with ventilation. Nature 1998;392(6673):239-40.
[20] Godin PJ, Buchman TG. Uncoupling of biological oscillators: a complementary hypothesis concerning the pathogenesis of multiple organ dysfunction syndrome. Crit Care Med 1996;24(7):1107-16.
[21] Sanderson PM, Watson MO, Russell WJ. Advanced patient monitoring displays: tools for continuous informing. Anesth Analg 2005;101(1):161-8.
[22] Drews FA, Westenskow DR. The right picture is worth a thousand numbers: data displays in anesthesia. Hum Factors 2006;48(1):59-71.
[23] Effken JA, Loeb RG, Kang Y, Lin ZC. Clinical information displays to improve ICU outcomes. Int J Med Inform 2008;77(11):765-77.
[24] Kennedy RR, French RA, Gilles S. The effect of a model-based predictive display on the control of end-tidal sevoflurane concentrations during low-flow anesthesia. Anesth Analg 2004;99(4):1159-63.
[25] Zenker S, Rubin J, Clermont G. From inverse problems in mathematical physiology to quantitative differential diagnoses. PLoS Comput Biol 2007;3(11):e204.
[26] Kramer G. Auditory display: sonification, audification, and auditory interfaces. Santa Fe Institute studies in the sciences of complexity proceedings, Vol. XVIII. Reading, MA: Addison-Wesley; 1994.
[27] Santamore DC, Cleaver TG. The sounds of saturation. J Clin Monit Comput 2004;18(2):89-92.
[28] Loeb RG, Fitch WT. A laboratory evaluation of an auditory display designed to enhance intraoperative monitoring. Anesth Analg 2002;94(2):362-8.
[29] Sanderson PM, Watson MO, Russell WJ, Jenkins S, Liu D, Green N, et al. Advanced auditory displays and head-mounted displays: advantages and disadvantages for monitoring by the distracted anesthesiologist. Anesth Analg 2008;106(6):1787-97.
[30] Ballora M, Pennycook B, Ivanov PC, Goldberger A, Glass L. Detection of obstructive sleep apnea through auditory display of heart rate variability. Comput Cardiol 2000.
[31] Baier G, Hermann T, Stephani U. Multi-channel sonification of human EEG. In: Scavone GP, editor. Proceedings of the 13th International Conference on Auditory Display (ICAD 2007), June 26-29, 2007. Montreal, Canada: Schulich School of Music, McGill University; 2007. p. 491-6.
[32] Olivan J, Kemp B, Roessen M. Easy listening to sleep recordings: tools and examples. Sleep Med 2004;5(6):601-3.
[33] Baier G, Hermann T, Muller M. Polyrhythmic organization of coupled nonlinear oscillators. Proceedings of the 9th International Conference on Information Visualisation. Los Alamitos, CA: IEEE Computer Society; 2005.
[34] Baier G, Hermann T, Lara OM, Muller M. Using sonification to detect weak cross-correlations in coupled excitable systems. Proceedings of the International Conference on Auditory Display (ICAD 2005). Limerick, Ireland: International Community for Auditory Display; 2005.
[35] Zhang Y, Silvers CT, Randolph AG. Real-time evaluation of patient monitoring algorithms for critical care at the bedside. Conf Proc IEEE Eng Med Biol Soc 2007;2007:2783-6.
[36] Siebig S, Kuhls S, Imhoff M, Langgartner J, Reng M, Scholmerich J, et al. Collection of annotated data in a clinical validation study for alarm algorithms in intensive care—a methodologic framework. J Crit Care 2009 [in press].
[37] Takeuchi A, Hirose M, Shinbo T, Imai M, Mamorita N, Ikeda N. Development of an alarm sound database and simulator. J Clin Monit Comput 2006;20(5):317-27.
[38] Zong W, Moody GB, Mark RG. Reduction of false arterial blood pressure alarms using signal quality assessment and relationships between the electrocardiogram and arterial blood pressure. Med Biol Eng Comput 2004;42(5):698-706.
[39] Clifford GD, Aboukhalil A, Sun J, Zong W, Janz BA, Moody G, et al. Using the blood pressure waveform to reduce critical false ECG alarms. Comput Cardiol 2006;33:829-32.
[40] Tham RQY, Sasse FJ, Rideout VC. Large-scale multiple model for the simulation of anesthesia. In: Moller D, editor. Advanced simulation in medicine. New York: Springer; 1989. p. 173-93.
[41] Jeffcott SA, Mackenzie CF. Measuring team performance in healthcare: review of research and implications for patient safety. J Crit Care 2008;23(2):188-96.
[42] Watson M, Sanderson P, Lacherez P, Trentini M, Purtill T. Arbiter—a simulator for the design and evaluation of patient monitoring displays. Abstract for SimTect Healthcare Simulation Conference; 2005.