Interactive Data Visualization Installation on Suicide in Ireland
Nicola Cosgrove 10002870 Faculty of Science and Engineering Department of Computer Science & Information Systems University of Limerick
BSc in Music and Media Performance Technology April 17th 2014
1. Supervisor: Dr. Giuseppe Torre Music and Media Performance Technology University of Limerick Ireland
Supervisor’s signature:
2. Second Reader: Mr. Leon Mc Carthy Music and Media Performance Technology University of Limerick Ireland
Second Reader’s signature:
Abstract
This paper outlines the development of an interactive data visualization on suicide in Ireland within an installation setting. The Leap Motion 3D gestural controller is used as the means for a visitor to the installation to explore and interact with the data. A research overview is provided on the two core components of an interactive data visualization, representation and interaction, and a background to the issue of suicide in Ireland is given. Furthermore, an overview of the technologies and software used to develop the project is included, along with a detailed account of the development process and an evaluation of the installation and of 3D gestural control for interactive data visualizations. In conclusion, possible future developments and research are proposed.
Declaration
I herewith declare that I have produced this paper without the prohibited assistance of third parties and without making use of aids other than those specified; notions taken over directly or indirectly from other sources have been identified as such. This paper has not previously been presented in identical or similar form to any other Irish or foreign examination board. The work was conducted under the supervision of Dr. Giuseppe Torre at the University of Limerick.
Limerick, 2014
Acknowledgements
I would like to firstly thank my supervisor Dr. Giuseppe Torre for his guidance, enthusiasm and help throughout this project. Many of the staff at the Computer Science and Information Systems department have had a great impact on my studies over the four years of this degree. Their enthusiasm has had a great influence on me and encouraged me to be bold, inquisitive and inventive in my pursuit of knowledge. Many thanks to my classmates of the MMPT class of 2014 for their friendship, for rallying each other on when we thought we'd never make it to the end, and for their support throughout. Endless gratitude is extended to my partner Barry for his constant reassurance and love, and also to my twin Elaine for her encouragement.
Dedicated to the memory of those lives lost to suicide and to the strength and courage of those affected in the wake.
Contents

List of Figures

1 Introduction
  1.1 Aims and Objectives
  1.2 Methodology
  1.3 Report Outline

2 Research
  2.1 Data Visualization
    2.1.1 A Brief History
  2.2 Acquire - Parse - Filter - Mine
  2.3 Designing Data Visualizations (Represent-Refine)
    2.3.1 Color and Perception
  2.4 Interactivity
    2.4.1 Natural User Interfaces (NUIs)
    2.4.2 Hand Based Gesture Interaction
    2.4.3 Intersection of Natural User Interfaces & Data Visualization
  2.5 Suicide in Ireland
    2.5.1 Sources of Data and its Classification for Suicide in Ireland

3 Overview of Software and Technologies
  3.1 Software
    3.1.1 Processing
  3.2 Technologies
    3.2.1 Leap Motion
    3.2.2 Onformative Leap Motion Library for Processing

4 Development
  4.1 Visualization of Data
    4.1.1 Acquire
    4.1.2 Parse
    4.1.3 Filter
    4.1.4 Mining
    4.1.5 Representing
      4.1.5.1 Time Series Line Graph
      4.1.5.2 Coxcomb Diagram
    4.1.6 Refinement
  4.2 Adding Interactivity with the Leap Motion
    4.2.1 Interact
  4.3 Installation Environment

5 Evaluation
  5.1 Problems Encountered
  5.2 3D Gestural Control & Data Visualization
  5.3 Installation Day

6 Conclusion
  6.1 Future Developments

References

Appendix A: Developmental Concept Sketches & Screenshots of Developmental Coding
Appendix B: Screenshots of Developed Data Representation & Interface
Appendix C: Installation Day Setup & Other
List of Figures

3.1 Leap Motion Interaction Area
3.2 Leap Motion Coordinate System
4.1 Time Series Bar Graph
4.2 Time Series Bar Graph Male
4.3 Time Series Line Graph
4.4 Line Graph Stacked Female & Male
1 Change to Line Graph
2 Testing Pink Data Highlights
3 Circle Packing
4 Chord Diagram
5 Coxcomb Diagram Development
6 Refining the Line Graph
7 Pseudocode One
8 Pseudocode Two
9 Pseudocode Three
10 Pseudocode Four
11 Ambient Screen
12 Suicide Facts Screen
13 Menu Screen
14 Line Graph Screen Male
15 Line Graph Screen Female
16 Line Graph Screen Both Genders with Reference Lines & Data Highlight
17 Line Graph Screen with Age Connection
18 Coxcomb Diagram of Social Economic Groups of Deceased 2001-2012
19 Installation Setup
20 Closeup of Installation
21 Another Closeup of Installation
22 Leap Motion Developer Forum Feedback
1 Introduction

Data visualization has a varied history, from the cholera map of London made by John Snow in 1854 (Parkes, 2013) and William Playfair's bar graphs and pie charts (Tufte, 1990) to the interactive infographics the New York Times uses to tell a story (Pecanha, 2013). Data visualization can be described as "the use of computer-supported, interactive, visual representations of abstract data to amplify cognition" (Card et al., 1999). In the context of this dissertation the terms data visualization and information visualization are used interchangeably.

Visual representations allow the user to explore, discover and gain further insights into a certain area of interest; one image can convey a wealth of information. Data visualizations can serve many different purposes. Scientific visualization maps structural data collected from studies of brain activity, airflow, seismic surveys and much more, usually so that a scientist can more quickly analyze and identify trends, clusters or relationships. Other branches of data visualization include visualization in a historical context, infographics and data journalism. Information visualization can map unstructured data such as stock market, social networking and highest-grossing trends, while infographics and data journalism use data to create a narrative and tell the user a story. Another subset of data visualization is interactive visualization. Interactive visualizations share most of the features
of data visualizations but they extend them to include dynamic user interactivity.

Data visualization is an effective means of communicating information because of the nature of the human visual system: humans can perceive and process images much more quickly than a page of words. Data visualizations can also convey information to a group of people regardless of the language they speak. In light of this, this project intends to use the techniques found in data visualization to create awareness around the issue of suicide in Ireland. Many reports have been released by organizations including the National Suicide Research Foundation (Arensman et al., 2013), the Samaritans (Scowcroft, 2013) and the National Office for Suicide Prevention (Malone, 2013). Suicide in Ireland is a very real issue, and one that needs to be addressed and tackled in order to decrease the rate of suicide, particularly in tough times such as the economic downturn Ireland is experiencing. Through visual representations of the data released by organizations such as the National Suicide Research Foundation (NSRF), together with the mortality data the Central Statistics Office (CSO) makes publicly available, the project can convey the scale of the problem to users in the environment of an interactive installation. Through interactivity the users can gain more insight, and therefore more meaning, from the data visualization. The ability to cross-correlate their own attributes, such as age and gender, with the data displayed puts the data in the context of their own lives. It is hoped that, through a successful narrative, the installation can open a platform for conversations on suicide and on attitudes and feelings towards the issue. The use of a 3D gestural controller, the Leap Motion, for user interactivity allows for a more natural and intuitive way of interacting with a computer and exploring the information space.
1.1 Aims and Objectives
The aim of the project is to create more awareness around the facts and figures of suicide in Ireland through the use of data visualization techniques. Data visualization has many branches in terms of use; the ones I intend to research are the use of Natural User Interfaces for data visualization and the art of creating and visualizing a narrative around specific data sets related to suicide in Ireland for a public setting.

The way we use and interact with technology has changed in the last few years with the growth of smartphones and tablets. Interaction began with the mouse and the keyboard; now there are tangible touchscreen devices and new natural user interfaces. Gestural control has been researched for over 30 years (Bhuiyan and Picking, 2009), but only now are technologies really embracing it as a viable means of interaction in everyday use. I aim to use gestural control, via the Leap Motion, as the means of user interaction with the data visualization installation, allowing the user to explore and create insights from the data in a more natural, intuitive way. Users typically browse data visualizations through a web-based application or through touchscreen interfaces in a museum or gallery setting. By setting the data visualization in the environment of an interactive installation within the University of Limerick, I hope to encourage people of various backgrounds, ages and genders to engage with the issue of suicide in an accessible, clear and compelling way.

In order to be compelling and to invite the user to explore, data visualizations need a good balance between aesthetics, design and displaying the data appropriately. The aim of the visual design aspect of this project is to apply methods commonly used in graphic design, information design and cartography, paying especially careful attention to the use of color and how we perceive graphics.
Visualizations need to be clear and simple in order to inform; therefore a simple, beautiful, unexaggerated two-dimensional user interface will be suitable. Extensive research will be performed into the field of HCI for information visualization and also into the field of Natural User Interfaces to inform the development of this project.
1.2 Methodology
I began this final year project with little to no experience in the field of data visualization but a strong interest, ambition and love for the work of data scientists and information designers. On the surface I presumed the only methods required for data visualization were skill, a good eye and a desire to visualize information and tell a story. However, as I delved deeper into my research, it became apparent that there are many different frameworks or methodologies an information designer or data scientist can follow in order to achieve their intended goal. The ASSERT model (Ferster, 2012) is put forward by Bill Ferster in his book Interactive Visualization: Insight Through Inquiry and was developed to support the creation of visualizations that are insightful and accessible. Dan Roam's Back of the Napkin model uses simple drawings to describe ideas (Ferster, 2012), while the French cartographer Jacques Bertin set out a storing, communicating and processing information model (Ferster, 2012) in his seminal book Semiology of Graphics (Bertin, 1983).

In order to achieve the goals set out in this dissertation, I have decided to use the methodology Ben Fry presented in his doctoral thesis at the Massachusetts Institute of Technology, titled Computational Information Design (Fry, 2004). This framework starts with a collection of data and the goal of answering a certain question using that data (Ferster, 2012). Its seven steps, presented as a logical, utilitarian sequence, are used as a guide to achieving the data visualization goal. They are:

• Acquiring - obtaining the data, whether from a file on disk, available locally or through an internet connection.
• Parsing - structuring the data and ordering it into categories.
• Filtering - removing data that is not of interest.
• Mining - using statistical or data mining methods to find patterns in the data.
• Representing - using visualization techniques such as tables, graphs, etc. to communicate the patterns found in the data.
• Refining - iterating on the design of a basic representation in order to make it clearer and more visually engaging.
• Interacting - adding interactivity to the visualization so that the user can gain further insights in a drill-down manner and control what features are visible.
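As a rough illustration of the first four steps, the sequence can be sketched in a few lines of Java (the project itself was built in Processing, which is Java-based). The data values, field names and class names below are invented for the example and are not taken from the project:

```java
import java.util.ArrayList;
import java.util.List;

public class FryPipelineSketch {

    // Acquire: in practice the data comes from a .tsv file on disk;
    // here a small hypothetical tab-separated string stands in for it.
    static final String RAW =
          "year\tgender\tcount\n"
        + "2010\tMale\t386\n"
        + "2010\tFemale\t100\n"
        + "2011\tMale\t400\n";

    // Parse, filter and mine: returns {min, max, mean} of the counts
    // for one gender. A later "represent" step would use these values
    // to scale the axes of a graph.
    static double[] stats(String gender) {
        // Parse: split lines and fields, converting each count to a number.
        // Filter: keep only the rows for the gender of interest.
        List<Integer> counts = new ArrayList<>();
        String[] lines = RAW.split("\n");
        for (int i = 1; i < lines.length; i++) {   // skip the header row
            String[] fields = lines[i].split("\t");
            if (fields[1].equals(gender)) {
                counts.add(Integer.parseInt(fields[2]));
            }
        }
        // Mine: basic statistics (minimum, maximum, mean).
        int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE, sum = 0;
        for (int c : counts) {
            min = Math.min(min, c);
            max = Math.max(max, c);
            sum += c;
        }
        return new double[] { min, max, (double) sum / counts.size() };
    }

    public static void main(String[] args) {
        double[] s = stats("Male");
        System.out.println(s[0] + " " + s[1] + " " + s[2]);
    }
}
```

The remaining steps (representing, refining, interacting) are design and iteration activities rather than a fixed computation, which is why the framework treats them separately.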
1.3 Report Outline
The remaining chapters of this dissertation are as follows. Chapter Two provides a research overview of the fields of data visualization, interactivity and suicide in Ireland. Chapter Three presents an overview of the software and technologies used for the development of this project. Chapter Four provides an overview of the development of the project and is split into subsections according to the steps defined in the methodology. Chapter Five provides an evaluation of the project in terms of problems encountered, 3D gestural control and data visualization, and the installation day. Chapter Six draws conclusions on the project and proposes possible future developments and research. A series of documents has been included in the Appendix section of this dissertation:
• Appendix A provides developmental concept sketches and screenshots of developmental pseudocode.
• Appendix B presents screenshots of the developed data representation and natural user interface.
• Appendix C presents pictures of the installation day setup and other items.
2 Research

2.1 Data Visualization

2.1.1 A Brief History
The use of visual representations has a long and rich history. The astronomers Galileo and Christoph Scheiner used visual representations to map sunspots over a long period of time, enabling significant insights into the way our universe works (Tufte, 1990). In 1904 E. W. Maunder took the micro-detail of these individual sunspot observations and brought them into a macro view by mapping only the interval of ±40° solar latitude. Traditionally, graphs, charts and maps had been used for such astronomical and scientific purposes, but a move has since taken place towards a more popular audience. In order to engage with this popular audience and explain complex facts to readers, information visualization had to move beyond the scientific field and "adopt a more general and popular attitude that is reflected by the appropriate form" (Behren, 2008). "Efforts to provide better insight into complex structures and activities by expressing statistics through images" (Behren, 2008) can be seen from the Age of Enlightenment that spread across Europe in the 18th century. One of the most influential pioneers of information design and data visualization is the Scottish engineer and economist William Playfair, who used information visualization to persuade and support his opinions through graphics rather than words. Charles Joseph Minard's flow map of Napoleon's 1812 campaign in Russia, published in 1861, is
recognized as an "outstanding masterpiece of statistical visualization and a great example of data-ink ratio" (Tufte, 1990). Another great early example of data compression, reducing to the essential and representing complex information, is Harry Beck's redesign of the London Underground map in 1933 (Transport for London, n.d.). Through the 20th century data visualization was used as a means to visually represent trends, maps and information such as train timetables. In the 21st century, with the explosion of data collection in modern society, information design has moved into everything from mapping the human genome by Ben Fry (Fry, n.d.) to visualizing the bloody toll of the war in Iraq through an infographic (Scarr, n.d.) to visualizing our daily life, as a quantified self, through a mobile phone application (Feltron, n.d.).
2.2 Acquire - Parse - Filter - Mine
The data visualization design process starts with some kind of data set. Data can be acquired from many different sources, including online government data websites, scraping or mining websites with data-scraping techniques, or collecting and storing data in databases. Pre-processing and transformation must take place in order to parse, filter and mine the data so that it is usable before it is mapped to a visual representation. First comes a raw-data stage in which issues such as missing values, errors in input and overly large data sets are dealt with: missing data may require interpolation, while large data sets may need sampling, aggregation, filtering and partitioning (Ward et al., 2010). Because of the nature of the data I acquired for this project, this pre-processing and transformation had already been performed, so the clean data could be used directly to create a specific visual representation.

In statistics there is a classification scheme of levels of measurement applicable to any source of information (Behren, 2008). This scheme characterizes the mathematical value of data items and the logical operations that can be performed on those values (Behren, 2008). The five levels of measurement form a hierarchy. The lowest is nominal: these items are categorized by names or labels, and the only mathematical operation that can be performed
on them is similarity: does an item belong to a certain group or not (Behren, 2008). Ordinal data has a natural ordering; such items are connected to each other and have relationships, and the mathematical operations used on them are greater-than and less-than, although with ordinal data you cannot say with certainty whether the intervals between items are equal. Both nominal and ordinal data are categorical. Metrically scaled data items are measured at either the interval or the ratio level. Interval data items sit at fixed intervals, with an equal difference between each pair of values, so addition and subtraction can be performed on them. At the ratio level of measurement, multiplication and division can also take place. The fifth and highest level of measurement is the absolute level; items in this category have naturally given features and characteristics, such as population figures.

Parsing is the means by which data is formatted and tagged so that it can be used easily within software for visualization. As mentioned previously, data in this project is parsed by reading in .tsv files, i.e. the tab-separated file format. Each piece of data in the file is converted into its relevant data type within the program, i.e. strings, floats, integers, chars and an index. In the filtering stage, data that is not of interest is removed. It is important that a data visualization of an explanatory nature starts with a question which is answered through the data. One means of reducing data to its essentials is normalizing items into a workable range of numbers, most commonly a range of 0 to 1. In the mining step, maths and statistical methods are used to extract useful information from the data. In brief, these methods include finding the maximum and minimum, the mean, the variance and the standard deviation, as well as sorting and distance metrics (Fry, 2004).
More advanced data mining methods include principal component analysis (PCA), multidimensional scaling (MDS) and Fourier transforms (Fry, 2004). Methods for classification such as clustering, probabilities, self-organizing maps and search & optimization can be used to evaluate the data (Fry, 2004). When mapping data, however, it is important to find a mapping that will not create false relationships that do not really exist. It is key in these first steps of the data visualization process to ask: what is the knowledge to be gained from this visualization? By
using acquisition, parsing, filtering and mining methods values or data dimensions become relevant to what the visualization is conveying. By taking the data items that matter the most, key relationships and patterns within the data can be communicated effectively.
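The normalization to a 0-1 range mentioned above is a simple linear rescaling. In the project's Processing environment the built-in map() function performs it; the calculation itself is sketched here in plain Java, with made-up values standing in for mined data:

```java
public class NormalizeSketch {

    // Linearly rescale a value from the range [min, max] into [0, 1].
    // Processing's map(value, min, max, 0, 1) does the same arithmetic.
    static double normalize(double value, double min, double max) {
        return (value - min) / (max - min);
    }

    public static void main(String[] args) {
        // Hypothetical yearly counts mined from a parsed data file.
        double[] counts = { 100, 250, 400 };
        for (double c : counts) {
            // Normalized values can then be mapped directly to pixel
            // positions or bar heights when representing the data.
            System.out.println(normalize(c, 100, 400));
        }
    }
}
```

Working in a normalized range keeps the representation code independent of the raw data's units, so the same drawing routine can serve different data sets.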
2.3 Designing Data Visualizations (Represent-Refine)
Data visualizations can be designed for either exploratory or explanatory purposes. An exploratory visualization is appropriate when you want the user to explore a data set in search of something that was not known before. With an explanatory representation you already know what the data is saying, but you want to convey and tell a story about that data to the reader. A hybrid of the two involves a curated data set in which the user is being told a story but has the means to interact with the data set, explore it further and create further insights (Iliinsky and Steele, 2011).

The key to designing a good data visualization is a well-thought-out design that follows long-standing principles for color, perception and placement (Iliinsky and Steele, 2011). One such principle, stated by Edward Tufte, is the data-ink ratio (Segaran and Hammerbacher, 2009). This heuristic directs the designer to evaluate the proportion of ink on a page that is used to represent the data: the larger the ratio, the more efficient the graphic representation, and so the greater the depth of information that can be displayed (Segaran and Hammerbacher, 2009). Although originally stated for designing data visualizations on paper, this principle applies just as readily to designing with a computer interface.

Common representations of data are scatter plots, line graphs, bar charts, pie charts, maps, matrices, tree graphs, histograms, dendrograms, parallel coordinates and iso-surfaces. Each serves its own purpose as a means to represent data effectively, and the designer must make careful decisions in order to use the right kind of representation. Experimenting with different ways of representation is an iterative process performed over and over again in the refinement
stage (Fry, 2004). The next important aspect of designing data visualizations is the use of color, and how the reader will perceive it.
2.3.1 Color and Perception
Visual representations should be designed to take advantage of the incredible capability of the human visual system to take in a huge amount of information for processing in the brain. The brain can identify patterns and create relationships between those patterns quickly, but there is a limit to its cognitive load: studies have shown the span of immediate short-term memory to be approximately seven items, i.e. the brain can remember a sequence of roughly seven stimuli (Ward et al., 2010). Careful design must avoid producing visual interference effects through perception, i.e. optical illusions or visual violations. If a visual violation is present in the data, the reader expects it to be there for a reason, specifically to point to an important aspect of the data. Design patterns can help tackle common design problems within the field of information design with solid solutions (Behren, 2008).

Pre-attentive variables provide a categorization of the graphic attributes of data. There are eight ways in which graphic objects can encode data and information, put forward in the Semiology of Graphics by Jacques Bertin in 1983: position, shape, size, brightness, color, orientation, texture and motion. Each of these variables is used either to group similar items, to distinguish one item from another or to point to relationships in the data. Position can convey a hierarchy; more importantly, since we normally read top to bottom and left to right, position is an important cultural convention in design. The shape of a data item can be circular, square or some unique shape. Size is important for conveying the value of a data item. Orientation and motion are used to convey direction or activity in data, for example in weather maps to convey wind direction or air flow. Texture is used to differentiate between data items.
In maps especially, texture is used as a way of distinguishing different terrains. There is a term used to describe the proximity of objects or conceptual ideas to each other: semantic
distance. The brain is exceptional at grouping together items that are close or in relative proximity to each other (Behren, 2008).

Brightness and color are perhaps the most important design considerations, and so are discussed in some detail here. The fundamental uses of color in visual representations are to define, label, measure, imitate reality and decorate. Color, i.e. hue, is not naturally ordered in our brains; brightness and the intensity (saturation) of a color are, but not the actual color itself (Behren, 2008). Within a data visualization, no more than six to eight distinct colors should be used to encode data items close to each other, as the human visual system cannot distinguish more than eight (Kultys, 2013). Colors used to measure should be applied with intelligence, and a color's saturation level should be scaled to the scale of measurement that color is representing. This principle is especially evident in the design of maps (cartography), where color is used to measure the height of different terrains or to distinguish dry land from water, and in heat maps, where color indicates temperature. Another important aspect of color design is to consider those with color blindness; ColorBrewer is an online tool used by designers to select good color schemes that are color-blind friendly (Color Brewer, 2013). The Swiss cartographer Eduard Imhof wrote much about the use of color and its importance in the design of maps, and many of his design heuristics for cartography carry over to the design of any visual representation. Imhof said that pure, bright or very strong colors have loud, unbearable effects when spread over large areas close to each other, but that they can have magnificent effects when used sparingly (Tufte, 1990).

Another aspect of design to consider is the use of words in the visualization.
Each word needs to serve a specific purpose; too many unnecessary words on the display can distract from the visualization itself. The choice of font is also important in order to avoid visual fatigue. "Clutter and confusion are failures of design, not attributes of information" (Tufte, 1990). A powerful design method for avoiding such failure and enriching content is the use of layering and separation. Tufte (1990) also says that "Simpleness is another aesthetic preference, not an information display strategy, not a guide to clarity. What matters is the proper relationship among information layers. These visual relationships must be
in relevant proportion and in harmony to the substance of the ideas, evidence and data conveyed." Gridlines should be used simply to illuminate the data; their line thickness and color should not distract from the information sitting on them (Fry, 2004).
2.4 Interactivity
Interactivity is an important aspect of data visualization, as it allows the user to transform or filter the data on display down to an element they want to explore further. By cross-correlating their own age group and gender, users can view data unique to them. Research within the visualization community has largely focused on interactions for a Windows, Icons, Menus, Pointer (WIMP) desktop interface. Post-WIMP interfaces, more commonly known as Natural User Interfaces (NUIs), move beyond mouse and keyboard interactions to use more "natural" interactions. Presented here is an overview of Natural User Interfaces and of research at the intersection of mouse-less interfaces and data visualization.
2.4.1 Natural User Interfaces (NUIs)
Rapid advances in areas such as computing performance, processing power, 3D cameras, motion detection and machine vision algorithms have made gesture-based interfaces and technologies more affordable and viable (Garber, 2013). The increasing adoption of devices such as smartphones and wearables such as smart watches, with heads-up displays such as Google Glass on the horizon, indicates a desire to decrease the use of traditional interfaces and “eliminate the use of intermediate devices such as mouse or keyboard to control computer.” (Blake, 2010). Designers, technologists and developers are therefore looking to create more “natural” interfaces and interactions, moving beyond the desktop paradigm to natural user interfaces (NUIs). They are looking to gesture, touch, freeform and other input modalities to create a more embodied human-computer interaction.
Liu (2010) defines NUIs as “an emerging computer interaction methodology which focuses on human abilities such as touch, vision, voice, motion and higher cognitive functions such as expression, perception and recall”. Liu (2010) further defines NUIs by characteristics such as user-centered, multi-channel, inexact, high bandwidth and behavior-based interaction. User-centered describes the dialogue between a user and the machine, with the machine as an active participant responding to various human “natural” gestures. Multi-channel refers to making use of one or more of the human sensory and motor channels in order to enhance the “naturalness” of the interaction. Inexact describes how a smart interface can recognize gestures and react and respond to a user's intent. High bandwidth refers to an increase in the user's freedom of expression when interacting with a NUI. Behavior-based interaction refers to interactions based on human behaviors, i.e. capturing the natural way humans communicate through gesture, voice and expression. Behaviors are captured through the “positioning, tracking, movement and expression characteristics of human body parts to understand human behavior and action.” (Liu, 2010). NUIs are designed, meaning that they require a certain degree of careful planning in advance of development (Blake, 2010). Compared to WIMP interfaces, NUIs should be designed to make use of natural human behaviors such as touch and gesticulation. Because a user can interact directly with content and interface elements, Blake (2010) argues that direct manipulation should be the primary interaction goal.
2.4.2 Hand Based Gesture Interaction
Gestural interfaces are of particular interest in the context of this project. A gesture is a non-verbal communication made with a part of the body, used instead of or in combination with verbal communication (Saffer, 2008). Gesture comes naturally and is intuitive. Most gestural interfaces can be classified as either touchscreen or free-form (Saffer, 2008). Examples of touchscreen devices include smartphones, kiosks in public areas and information points. Free-form gestural devices, although not as prevalent as touchscreen, are popular in gaming and in interactive art installations and displays, first with the Kinect motion
sensor from Microsoft (Kronlachner, 2013) and now with the release of the Leap Motion 3D controller. Hand-based gesture interaction, or free-form interaction, can be classified into two categories: unimanual and bimanual (Kavanagh, 2012). Unimanual describes interactions using one dedicated hand or finger, usually the preferred hand, while bimanual describes interactions requiring the use of two hands. Boussemart et al. (2004) proposed that unimanual interactions be dedicated to selection/pointing tasks while the other free hand is used to perform actions on the selected objects. Hinckley et al. (1997) comment that unimanual control is more reliant on visual feedback or system status as a consequence of user intent. Hinckley et al. (1997) also argue that this reliance on visual feedback carries a heavy “cognitive load in task manipulation.” When two hands are used, this cognitive load is reduced, which can encourage more efficient exploration of the task solution space (Hinckley et al., 1997). Guiard's (1987) model of asymmetric division of labor further supports the idea that we assign a role to our preferred hand for either “gross or fine movements ...which allow efficient task performance.” Baudel et al. (1992) have identified some limitations and challenges to the use of hand gestures for interaction. They comment that with an ideal hand gesture input device, the hand would become the device. However, most people trying a hand gesture system for the first time experience frustration and disappointment. “Immersion syndrome” and “segmentation of hand gestures” are named as two primary reasons. The immersive nature of gesture-controlled systems means that the system attempts to capture every motion of a user's hand in order to interpret an intended gesture. However, some gestures are unintentional, and so immersion syndrome can occur. Baudel et al.
(1992) comment that “the user can be cut from any possibility of acting or communicating simultaneously with other devices or persons.” Segmentation of hand gestures refers to the need for the system to interpret gestures from a continuous flow of interaction into meaningful control commands. In comparison to the use of the mouse and keyboard, Baudel et al. (1992) identified secondary challenges such as user fatigue and lack of comfort.
2.4.3 Intersection of Natural User Interfaces & Data Visualization
How a user interacts with a computer is a massive field of research called Human-Computer Interaction (HCI) and one that is beyond the scope of this report. It was deemed necessary, however, to give a concise overview of the exciting, young area of research focusing on the move from WIMP interfaces to mouse-less interfaces using new input technologies for interactive data visualizations. Interaction is difficult to define, and so interaction models and taxonomies exist to guide the design of interactions for HCI. A broad yet all-encompassing definition of an interaction model was given by Beaudouin-Lafon (2000): “a set of principles, rules, and properties that guide the design of an interface. Such a model describes how to combine interaction techniques in a meaningful and consistent way and defines the “look and feel” of the interaction from a persons perspective.” There are several such models/taxonomies to serve as design guidelines for desktop-based interactive data visualizations. However, researchers within the data visualization community are now asking whether these design guidelines are applicable for data visualizations on NUIs. Yi et al. (2007) write that “even though interaction is an important part of information visualization, it has garnered a relatively low level of attention from the visualization community” compared to research on the techniques to represent and visualize data. Even less attention has been paid to “leveraging the possibilities of novel interaction modalities and models” for data visualization (Lee et al., 2012). Yi et al. (2007) identify seven general categories of data visualization specific interaction techniques based on a user's intent: select, explore, reconfigure, encode, abstract/elaborate, filter and connect.
Another useful reference model for designing more general interactions for data exploration is Shneiderman's (1996) mantra “overview first, zoom and filter, then details on demand.” These design considerations are often made in the context of data types and tasks, or as part of an information visualization pipeline such as the methodology presented in this dissertation. These taxonomies often focus on interactions in terms of tasks that perform transformations on the data and its representation (Lee et al., 2012). Isenberg et al.
(2012) propose that these existing HCI information visualization taxonomies/models be extended or rethought with regard to data visualization for NUIs. They propose that researchers consider interaction techniques that are not only task- and 'data-centric' but also incorporate 'people-centric' interactions. Lee et al. (2012) propose that the information visualization community look at interactions from a more novel, broader perspective, “taking into account the entire spectrum of interaction.” Furthermore, they propose that alongside existing interaction techniques, researchers can also “capture the capabilities of newly available interface and interaction technologies.” Lee et al. (2012) describe three principal dimensions for a new approach to designing interactions for a NUI: About the Individual, About the Technology, the Sociality, and the interplay between these. About the Individual concerns a user's degree of intent, the interaction distance between the user and the technology, the freedom of expression and the impact of system feedback (Lee et al., 2012). About the Technology concerns input type, input resolution and output type (Lee et al., 2012). Sociality concerns data visualization and NUIs for collaborative environments. Focusing on About the Individual and About the Technology, what is of particular interest is the increased freedom of expression offered by new technologies such as 3D gestural control and touchscreen devices. Exploring different input modalities such as these may allow more degrees of freedom in interaction and thus reduce the number of necessary UI components. This can help analysts focus their attention on their main task, the visual exploration of the data, rather than the manipulation of the interface (Lee et al., 2012). Providing a higher degree of freedom, however, can mean less accurate input resolution. Lee et al.
(2012) comment on the need to find a balance between a high level of freedom and a high level of recognition reliability and accuracy. Yi et al. (2007) view interaction techniques in data visualization as the features that provide users with the ability to directly or indirectly manipulate and interpret representations.
2.5 Suicide in Ireland
Globally, almost one million people die by suicide every year; this corresponds to one death every 40 seconds (WorldHealthOrganisation, 2013). More people die by suicide each year than by murder and war combined (Samaritans, n.d.). Between 10% and 14% of the general population experience suicidal thinking during their lifetime (Samaritans, n.d.). At a national level, Kevin Malone, in his remarkable survey on suicide in Ireland 2003-2008 (Malone, 2013), states that suicide is the leading cause of death for young men in Ireland. Although great strides have been made in research into conditions such as diabetes, heart disease and cancer, research into suicide in Ireland is small in comparison. Malone (2013) also commented that we now have a wave of young people in Irish society for whom suicide rates among their peers have increased substantially from those of their parents. Not only is suicide likely to remain the leading cause of death in these children in the next decade, but it will also be the leading cause of peer bereavement. Malone's goal, as stated in the report, is to move out of a knowledge vacuum, beyond awareness to knowledge and sustained and integrated action. Malone looks at some of the causes of suicide through case studies and interviews with bereaved families. His findings suggest that toxic humiliation and polyvictimization are new terms that we must learn about and understand (Malone, 2013). He also identifies a previously unreported age-dependent epidemiological movement for suicide. In an epidemiological study 1993-2008, it was found that suicide rates increased for both males and females among children under 18 years old. This equates to the death of a child (under 18) in Ireland every 18 days (Malone, 2013).
In my research I found that suicide is estimated to be under-reported for many reasons, including stigma, religion and social attitudes. Many suicides are hidden among other causes of death, such as road traffic accidents and drowning (Samaritans, n.d.). Another believed reason for the under-reporting, approx.
40% (Malone, 2013), of suicide in Ireland is the fact that suicide was only decriminalised in Ireland in 1993. I found the research being conducted by the National Suicide Research Foundation and the National Office for Suicide Prevention to be invaluable for gaining an insight into this issue. In their report titled Second Report on Suicide Support & Information Systems, published in 2013, they present findings from a detailed case study of 307 cases of suicide in Cork conducted from September 2008 to June 2012. The research revealed facts such as that March, May and October were exceptional in that more than 10% of suicides occurred in each of these months, 33.1% in total. They also outline findings from interviews with bereaved families on possible motivations for taking one's life, the method used to die by suicide and the occupation of the deceased. Further discussed in the report are risk factors associated with suicide, and the suicide clustering and contagion identified in Cork County in 2011. The National Registry of Deliberate Self Harm was established by the National Suicide Research Foundation in 2001 (Arensman et al., 2013). This registry is a national system for monitoring the occurrence of deliberate self harm in Ireland. It is used to identify trends over time and to help in the progress of research and prevention (Arensman et al., 2013). It was decided early on in my research, for the purpose of this project and to keep the level of data and information at a manageable level, not to include findings from the National Registry of Deliberate Self Harm.
2.5.1 Sources of Data and its Classification for Suicide in Ireland
Having researched suicide in Ireland through reports and academic research findings, I contacted the National Suicide Research Foundation for more detailed data sets. Statistician Amanda Wall emailed me an Excel file of suicide mortality rates per 100,000 population by sex and five-year age group for 1960-2011, and pointed me to the section of their website which showed suicide by area of residence 2004-2010, rate per 100,000 population. The Central Statistics Office website provides a detailed database on vital figures related to births, marriages
and deaths in Ireland. I found that in order to understand the data being provided, I needed to understand the measures used to calculate rates of mortality within a population. The number of deaths in the population during a specified time period is divided by the number of persons in the population during that time period (StatsIndiana, 2013). In vital statistics, the denominator is the size of the population at the halfway point of the time period, and the fraction is usually expressed per 100,000. An age-specific mortality rate is limited to a particular age group (StatsIndiana, 2013). National Statistics Offices in all European countries, and in many non-European countries, classify the cause of every death according to the World Health Organisation's International Classification of Diseases, Injuries, and Causes of Death (Corcoran and Arensman, 2010). Although the main variables collected for vital statistics include date, address, place of death, cause, occupation, age, sex and marital status, the data available on suicide from the CSO is limited (Scowcroft, 2013). At the CSO, causes of death are classified according to an X+ number format. In particular, cause of death by suicide is classified by X60-X84, and suicide numbers and rates include only deaths classified under these codes. Furthermore, through my research and email correspondence with a statistician called Kevin O'Shea at the CSO, I found that occupation of the deceased at time of death was also classified into various categories called social economic groups. They are:

• 0 - Farmers, relatives assisting and farm managers
• 1 - Farm labourers and fishermen
• 2 - Higher Professional
• 3 - Lower Professional
• 4 - Employers and managers
• 5 - Salaried employees
• 6 - Non-manual wage earners (white collar)
• 7 - Non-manual wage earners (other)
• 8 - Skilled manual workers
• 9 - Semi-skilled manual workers
• Unskilled manual workers
• Unknown

This finding was interesting for me, as it revealed something more human in the data: what that person was working as when they died, and also which social economic group had been more susceptible to suicide in a given year. Kevin provided me with Excel files containing the rate of death by suicide for the years 2001-2011 by age group and gender.
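The rate calculation described above can be sketched in plain Java (the class name and the figures in the example are illustrative, not actual CSO data):

```java
public class MortalityRate {
    // Mortality rate per 100,000: deaths in the period divided by
    // the mid-period population, scaled to 100,000 (StatsIndiana, 2013).
    public static double ratePer100k(int deaths, int midPeriodPopulation) {
        return (double) deaths / midPeriodPopulation * 100000.0;
    }
}
```

For example, 450 deaths in a mid-period population of 4.5 million would correspond to a rate of 10.0 per 100,000.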
3 Overview of Software and Technologies

3.1 Software
For the purpose of developing and completing this project, Processing was used to implement the visualization of the data, the design of the NUI, and the interactions with the Leap Motion 3D gestural controller.
3.1.1 Processing
Processing is an open-source integrated development environment (IDE) developed by Ben Fry and Casey Reas at the MIT Media Lab in the early 2000s. The language is based on Java. It was created to give designers, students, educators and artists an easier way to code creatively, by hiding many of the low-level technicalities of programming. The great success of Processing is due to its open-source nature and a committed community of third-party developers and contributors extending its libraries to increase its functionality across the different digital creative mediums. With such an active community, a wealth of documentation on the use of Processing for data visualization, and its ease of use, it is the ideal platform for implementing this project.
Processing applications contain two basic functions, setup() and draw(). The setup() function is called only once, when the program is run. This function commonly initializes constant parameters of the program, such as size(), the size of the sketch, background(), the background color, and an anti-aliasing method named smooth(). The draw() function is called continuously, according to the animation frame rate specified by frameRate(). Common methods used in Processing for basic drawing are line(), point(), ellipse(), rect() and arc(). Custom shapes can be drawn using methods such as beginShape(), endShape() and vertex(). Drawing text to the screen can be achieved with methods such as textFont(), textSize() and text(). Colors are commonly specified as RGB values; however, Processing also provides the means to use color values in terms of HSB. Methods such as fill() and stroke() are used to change colors, and an alpha value can be passed as an argument to color methods to specify the opacity of a color. Basic mathematical functions in Processing include sqrt(), normalize(), cos(), sin() and round(). To remap values in Processing from one range to another, the map() method can be used. Useful methods for reading files into Processing and parsing values include parseFloat(), split(), trim(), getFloat(), loadStrings() and getString(). Ben Fry has included an invaluable class called the Table class for reading in data from tab-separated file types. Tab-separated files (.tsv) are similar to comma-separated files (.csv), except that data is separated by tabs instead of commas.
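As a rough plain-Java illustration of the split()/trim()/parseFloat() parsing described above (the class name and row layout are assumptions for the example, with column 0 holding a row label such as the year):

```java
public class TsvRowParser {
    // Splits one tab-separated row, skips the label in column 0,
    // trims whitespace and converts the remaining fields to floats.
    public static float[] parseRow(String line) {
        String[] fields = line.split("\t");
        float[] values = new float[fields.length - 1];
        for (int i = 1; i < fields.length; i++) {
            values[i - 1] = Float.parseFloat(fields[i].trim());
        }
        return values;
    }
}
```

Processing's Table and FloatTable classes wrap essentially this logic, reading each row of a .tsv file into a two-dimensional structure.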
3.2 Technologies
The means by which the user interacts with the data visualization was implemented using the Leap Motion 3D gestural controller. Presented here is an overview of the Leap Motion controller technology, along with an overview of using the contributed Leap Motion library for Processing, LeapMotionP5, to implement interactions.
3.2.1 Leap Motion
The Leap Motion was first made available to a small number of developers before a general public release in July 2013. The controller is a small, sardine-can-sized, USB-compatible device which is placed face up in front of or near a computer. Inside the controller are two monochromatic IR cameras and three infrared LEDs (YouTube, 2013). The controller has an interaction area of 8 cubic feet: a 150-degree, inverted-pyramid field of view centered on the device (Figure 3.1). The device employs a right-handed Cartesian coordinate system (Figure 3.2). The X and Z axes lie in the horizontal plane, with the Z axis for depth having positive values increasing toward the user (LeapMotion, 2013a). The Y axis has positive values increasing upwards. The LEDs generate a 3D pattern of dots of IR light that allows the device to track all of the fingers on both hands to within 1/100th of a millimeter. The cameras generate data at a rate of over 200 frames per second, sending reflected data information to the computer to be analyzed by the Leap Motion software (LeapMotion, 2013b). Compared to the Kinect motion sensor, the Leap Motion allows for more precise control because of its smaller interaction area and the higher resolution of frame data. The Kinect is more suited to whole-body tracking, while the Leap can accurately track hands and individual fingers while recognizing different hand gestures such as a closed fist, an open palm, and Swipe, Screen Tap and Circle gestures. Gestures are recognized by a movement pattern algorithm provided by the Leap Controller object. Motion tracking by the device provides updates as a set, or frame, of data. Each Frame contains a list of tracking data for hands, fingers and tools, as well as recognized gestures. The Hand model provides data about the “position, characteristics, and movement of a detected hand as well as lists of the fingers and tools associated with the hand.” (LeapMotion, 2013a).
The physical characteristics of fingers and tools are abstracted into a Pointable object. Fingertip positions and directions are stored as vectors, providing information on a finger's position and the direction it is pointing. This level of control is why I decided to use the Leap Motion for my project. Looking forward, I believe use of the device in installation settings, for either museums or public information visualization areas, will grow because of the
Figure 3.1: Leap Motion Interaction Area
Figure 3.2: Leap Motion Coordinate System
lack of any physical touching: it is more sanitary than traditional interaction devices, and in the scenario of those who have disabilities it is ideal. Surgeons are starting to use the Leap Motion in this way because of the issue of sterilization (IMedicalApps, n.d.).
3.2.2 Onformative Leap Motion Library for Processing
Processing also has the added functionality to import external libraries into its IDE. One such library useful for the development of my project was LeapMotionP5, a simple wrapper library for the Leap Motion Java API, developed for Processing by the Onformative studio for generative design (http://www.onformative.com/lab/leapmotionp5/). It was designed to give Processing users and novice Leap Motion developers easy access to the Leap Motion Java API and to Frame data. Functions provided include ArrayList functions which allow developers to access currently tracked hand and finger positions and store these positions in a list. By implementing a for loop, Frame and Pointable data can be updated and dynamically stored as objects of the PVector class. The PVector class is used to describe two- or three-dimensional vectors and is often used to store the x, y and z components of a vector. For example,

for (Finger finger : leap.getFingerList()) {
  PVector fingerPos = leap.getTip(finger);
}

Functions such as getFinger() return a finger by number (nr). The first finger detected is identified within the library as nr 0. The position of the fingertip can be mapped to the size of the sketch window with getTip(). The getHand()
function returns a Hand by number; the hand detected first is assigned an id of nr 0, while the second hand detected is assigned nr 1. The getPosition() function for the Hand returns the average position of the hand's palm. Gesture recognition is enabled through callback methods called once a gesture is recognized. For example, to enable a Swipe gesture, the following code is used in the setup() method of Processing:

leap.enableGesture(Type.TYPE_SWIPE);

For a Swipe gesture to be recognized once enabled, the following method is implemented: void swipeGestureRecognized(SwipeGesture gesture). If a Swipe is detected, a swipe boolean can be changed from false to true.
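Putting these pieces together, a minimal Processing sketch using the wrapper might look as follows. This is a sketch only: the import paths are assumptions based on the library's package naming and may differ between versions, and it requires the Leap Motion hardware and drivers to run.

```java
import com.onformative.leap.LeapMotionP5;   // assumed package path
import com.leapmotion.leap.Gesture.Type;
import com.leapmotion.leap.SwipeGesture;
import com.leapmotion.leap.Finger;

LeapMotionP5 leap;
boolean swiped = false;  // flipped by the swipe callback below

void setup() {
  size(800, 600);
  leap = new LeapMotionP5(this);
  leap.enableGesture(Type.TYPE_SWIPE);
}

void draw() {
  background(swiped ? 40 : 0);
  // Draw a dot at each tracked fingertip, mapped to the sketch window.
  for (Finger finger : leap.getFingerList()) {
    PVector fingerPos = leap.getTip(finger);
    ellipse(fingerPos.x, fingerPos.y, 10, 10);
  }
}

// Callback invoked by the library once a swipe gesture is recognized.
void swipeGestureRecognized(SwipeGesture gesture) {
  swiped = true;
}
```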
4 Development

The development of this project can be split into three core components: the representation of data, implementing interactivity with the Leap Motion, and the installation environment. As previously stated, Ben Fry's framework for interactive data visualizations, acquire-parse-filter-mining-representing-refinement-interact, served as a solid guide for approaching the project. Fry's book Visualizing Data (Fry, 2008) also served as an excellent reference for the implementation of code in Processing, and it should be noted that code presented here is based on examples from this book. The interact element of Fry's framework concerns implementing interaction for WIMP interfaces, and so this element has been extended in the context of the research previously presented on interactive data visualizations for NUIs.
4.1 Visualization of Data

4.1.1 Acquire
As previously stated in Section 2.5, data was mainly acquired from the National Suicide Research Foundation (NSRF) and the Central Statistics Office (CSO) by email correspondence. Facts about suicide in Ireland were pulled from reports available for public consumption from the Samaritans (Samaritans, n.d.), A Survey of Suicide in Ireland 2003-2008 (Malone, 2013) and the NSRF (Arensman
et al., 2013). An Excel file containing data collected for deaths by suicide in Ireland, by gender, year and age group, for the years 1960-2011 was sourced from the NSRF. The raw data had already been pre-processed for ethical reasons to contain only values reflecting the rate per 100,000 population. I wanted to use and visualize data that showed a more human element to the lives lost. Having researched how deaths by suicide in Ireland are recorded by the CSO, I contacted Kevin O'Shea at the CSO to obtain data on the occupation of the deceased at time of death, by year, age and gender. As previously stated, this data is categorized by social economic group (SEG) (see Section 2.5). SEG data was only requested for the years 2001-2011 initially; 2012 and 2013 SEG data was unavailable at first contact, as the figures were only provisional. However, at a later stage the CSO provided me with actual figures for the year 2012. Data had also been sourced from the CSO providing figures of death by suicide by gender and cause (X codes), i.e. the method used. Although this data is useful for psychological and epidemiological research, I felt early on that it was too shocking for the purpose of this project, considering the installation could attract younger viewers.
4.1.2 Parse
Parsing is concerned with the structuring and ordering of the data into categories. Both the SEG and NSRF data were categorized firstly by year. Yearly data can be described as interval variables: the difference between each year is meaningful, as it can indicate a decrease/increase in deaths. Secondly, both the SEG and 1960-2011 data were also categorized by gender, i.e. figures for males and figures for females. Classification by gender and by social economic group is of nominal type. The NSRF had categorized the data provided into age categories. These categories were ages 10-14, 15-19, 20-24, 25-29, 30-34, 35-39, 40-44, 45-49, 50-54, 55-59, 60-64, 65-69, 70-74, 75-79, 80-84 and >85 respectively. Age group data can be classified as quantitative and of ratio type. SEG data also initially contained age group data. All of the data was provided by means of Excel spreadsheet files. File formats were converted from .csv to .tsv files to
make them readable and usable within Processing. The SEG data were segmented by year into their own individual .tsv files, ordered as follows: the year 2001 data named 0.tsv, the year 2002 data named 1.tsv, and so on for each consecutive year up to 2012, named 11.tsv. This ordering and segmentation were performed to allow the files to be read individually into Tables within Processing. The FloatTable and Table classes within Processing read data values into a two-dimensional array reflecting the rows and columns of the Excel file. It should be noted from here on that array index values begin from 0. Therefore row 1 of each .tsv file, containing the column headings, starts from row 0 in Processing, and similarly for all column (col) values. For the NSRF data, the column headings were the different age groups, while col 0 of each row contained the year names for 1960-2011. Column headings for the SEG .tsv files were SEG, Male and Female respectively, while col 0 of each row contained each SEG by name. Names were converted to Strings, rate values were converted to floats within the FloatTable class, and SEG values, being integers, were passed as Integer values to the Table class.
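The year-to-filename ordering described above can be expressed as a one-line mapping (sketched in plain Java; the class name is illustrative):

```java
public class SegFiles {
    // 2001 -> "0.tsv", 2002 -> "1.tsv", ..., 2012 -> "11.tsv"
    public static String fileNameFor(int year) {
        return (year - 2001) + ".tsv";
    }
}
```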
4.1.3 Filter
Filtering the data is concerned with removing data that is not of interest. As this installation would be taking place within the University of Limerick, it was essential to represent the core age groups of students and staff, i.e. from age 17 up to the common retirement age of 65. I therefore removed the 10-14 age group and totaled the age groups above 65 into one figure, categorizing this summation as >65. SEG age group data remained within the .tsv files while a suitable visualization solution for the SEG data was being developed; however, when the coxcomb diagram (to be discussed in subsection 4.1.5.2) was realized, age groups were removed and a total figure for all ages by gender remained.
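The totalling of the older age bands can be sketched as a simple summation (plain Java; the class name and band layout are illustrative, assuming the bands above 65 sit at the end of the array):

```java
public class AgeFilter {
    // Sums the figures from firstOver65Index to the end of the array
    // into a single ">65" value, mirroring the filtering step above.
    public static float totalOver65(float[] bands, int firstOver65Index) {
        float total = 0;
        for (int i = firstOver65Index; i < bands.length; i++) {
            total += bands[i];
        }
        return total;
    }
}
```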
4.1.4 Mining
Mining concerns using statistical methods or data mining techniques to tease out interesting patterns in the data, or to place the data in a mathematical context.
Using the getTableMax() and maxRowSum() methods, the maximum and minimum values for males and females were obtained. It is necessary to obtain maximum and minimum values in order to represent data on screen at a proper scale. Within Processing, each data value had to be remapped using the map() method in order to make it usable for visualization. It is necessary to loop through each index of the 2D data array; the getFloat() method is used to grab and store the data value at the current index as a float. The map() method takes the current value and remaps it to a value in a specified range. For example, to obtain values from the FloatTable object data:

for (int row = 0; row < rowCount; row++) {
  if (data.isValid(row, col)) {
    float value = data.getFloat(row, col);
    float x = map(years[row], yearMin, yearMax, plotX1, plotX2);
    float y = map(value, dataMin, dataMax, plotY2, plotY1);
    vertex(x, y);
  }
}
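The remapping performed by map() is a linear interpolation, which can be sketched in plain Java as (class name illustrative):

```java
public class Remap {
    // Equivalent of Processing's map(): scales value proportionally
    // from the range [inMin, inMax] to the range [outMin, outMax].
    public static float map(float value, float inMin, float inMax,
                            float outMin, float outMax) {
        return outMin + (value - inMin) / (inMax - inMin) * (outMax - outMin);
    }
}
```

Note that in the loop above plotY2 and plotY1 are passed in that order so that larger data values map toward the top of the plot, since screen y coordinates increase downwards.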
4.1.5 Representing
Representing is concerned with visualizing data values as a basic form. There are two core data representations within my project. Firstly, there is the line graph visualization of the NSRF data, i.e. the mortality rate per 100,000 population for suicide in Ireland for the years 1960-2011, by gender and by age groups 15 to >65. Secondly, there is the coxcomb diagram visualization of the SEG data for the years 2001-2012, by gender and by social class.

4.1.5.1 Time Series Line Graph
Due to the large range of years in the NSRF data and its time-series nature, I initially visualized these values as a bar graph for male, female and all values, with the x axis labelled in years and the y axis in rate intervals.

Figure 4.1: Time Series Bar Graph
Figure 4.2: Time Series Bar Graph Male

However, I wanted to convey more of a sense of the values moving from one year to the next, with a more definite shape, to create a more visually engaging graph. The bar graph was also deemed unsuitable for representing both male and female data on the same graph for comparison. This could have been achieved by placing a male bar beside a female bar for each year, but the visualization would then have been too noisy. I therefore decided to represent the time-series data as a line graph instead. The axes remained the same, and I encoded the male line graph with a grey color and the female line graph with a yellow color 1. These colors contrasted well to show the comparison between genders, but were neutral enough not to conflict with the background, and are colorblind friendly. Initially I considered having the time-series plot occupy only the top half of the screen in order to accommodate facts about suicide underneath. This idea was quickly abandoned as I remembered Tufte's data-ink ratio, and the representation was repositioned to take front and center of the screen. Adjusting the alpha values of the male and female colors better represented the comparison between genders. The core code implemented in Processing to draw the data to the screen as a filled line graph can be described with the following example source code. The method drawDataAreaTwo() returns nothing and so is of type void; the current column of the data2 array is passed to it as an argument of type int. The shape of the line graph itself is drawn as a custom shape, with the remapped data values defining the points of the line via vertex(xx, yy). When drawDataAreaTwo() is called, it loops through each row of the column, stores the value obtained on each iteration in value2, and remaps value2 to the range of the time-series plot. The xx value plots the year intervals.

    // FEMALE
    void drawDataAreaTwo(int col) {
      noStroke();
      beginShape();
      for (int row = 0; row < rowCount2; row++) {
        if (data2.isValid(row, col)) {
          float value2 = data2.getFloat(row, col);
          float xx = map(years[row], yearMin, yearMax, plotX1, plotX2);
          float yy = map(value2, dataMin, dataMax, plotY2, plotY1);
          vertex(xx, yy);
        }
      }
      // Draw the lower-right and lower-left corners
      vertex(plotX2, plotY2);
      vertex(plotX1, plotY2);
      endShape(CLOSE);
    }

This method was reused and renamed as drawDataArea() in order to plot the male values. Other methods, such as drawYearLabels() and drawRateLabels(), drew the rate and year axis labels to the screen. Small pink dots were implemented to encode the data points on the line graph, to test the data-highlight and details-on-demand functionality, and also to test whether the pink served its purpose of both complementing the other colors and popping out to indicate an important point of interest 2. The only UI elements visible at this point in development are circular buttons placed on the left side of the graph. When the male and female filled line graphs are drawn together on screen, this is known as a stacked line graph.
Figure 4.3: Time Series Line Graph
Figure 4.4: Line Graph Stacked Female & Male
4.1.5.2 Coxcomb Diagram
Upon investigating the SEG data, I soon realized there were patterns emerging with regard to which social economic groups had the largest proportion of deaths by suicide. Year by year from 2001-2012, social economic groups 8, 9 and 10, Skilled manual workers, Semi-skilled manual workers and Unskilled manual workers respectively, were consistently the most common, besides those deaths recorded as Unknown occupation. This led me to ask many questions, including: did the construction sector crisis with the fall of the economy in 2002 in Ireland lead to this increase amongst manual workers? Why is suicide more prevalent amongst certain social economic groups and not others? It was important that visitors to the installation could identify similar patterns within the SEG data, and so the data was represented as a Florence Nightingale coxcomb diagram. Other potential visualization solutions considered and loosely developed were a chord diagram, to represent the relationship between SEGs and age groups 4, and a circle packing algorithm, to show value differences between SEGs with circles 3. However, these were rejected for reasons including the complexity of implementing the code, visual interference, and their unsuitability for effectively visualizing comparisons between SEGs. A coxcomb diagram shows the statistical comparison of elements on a polar diagram. It is used to plot cyclic phenomena, and Nightingale, who devised the diagram, used it to visualize causes of mortality in the army of the East during the Crimean War (UnderstandingUncertainty, 2008). The code used to implement the coxcomb diagram was based on open source code provided by Ale, http://www.openprocessing.org/sketch/35488, and adapted for the purposes of this project. The code is complex, so a concise explanation is as follows. All sectors of the polar diagram must have equal angles: 360° divided by the number of social economic groups gives an equal angle for each SEG sector. The area of a sector, i.e. how far it spreads out, reflects the value of the SEG data. Each SEG sector of the polar diagram not only represents a particular SEG for the selected year, but also contains information by gender. Initial development had UI elements placed above the diagram and male and female sectors colored pink and blue 5. These elements were changed and refined in the next stages of development.
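The geometry described above can be sketched as follows. This is a simplified illustration, not the adapted openProcessing code: each of n SEG sectors spans 360/n degrees, and, since a coxcomb encodes value as sector area rather than radius, each sector's radius is taken proportional to the square root of its value.

```java
public class CoxcombGeometry {
    // Equal angle (in degrees) for each of n SEG sectors.
    static double sectorAngle(int n) {
        return 360.0 / n;
    }

    // Radius for a sector so that its AREA is proportional to value:
    // a circular sector's area grows with the square of its radius,
    // so the radius scales with the square root of the value.
    static double sectorRadius(double value, double maxValue, double maxRadius) {
        return maxRadius * Math.sqrt(value / maxValue);
    }
}
```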
4.1.6 Refinement
The refinement step concerns, as the name suggests, refining the basic representations to clarify the design and visualization of the data. Described here are the adjustments made to realize the completed data visualizations. The initial white background of all screens was changed to a light grey for signal enhancement and to improve the accuracy of user readings; white was deemed too bright and distracting a color when displayed on a larger screen, and could lead to viewer fatigue. White was instead used in smaller amounts to color all the UI buttons. There were several iterations in designing the position of other UI elements, such as the age group "buttons" 6. Eventually the age group buttons were split into two groups and displayed to the left and right of the line graph. Titles and information, which dynamically update their readings through user interactions, were placed above the line graph; these areas of text were encoded with a rustic orange color. A reference legend for gender was also placed above the line graph. The pink of the data highlights was adjusted to a more intense pink, and the points were made bigger 6.1. Refinement of the coxcomb diagram was minimal, as the basic representation had been strong enough to carry the data. The tips, or outer radius, of each sector were resized and recolored from pink to yellow to represent female figures, while the inner radius was resized and recolored from blue to grey to represent male figures. These color changes were implemented to keep the colors for male and female values consistent throughout the installation. Buttons to control which year is shown were placed to the left of the diagram. A legend to the top right of the diagram updates its information depending on which sector is selected by the user and which year is currently shown. Labels displaying the name of each social economic group were drawn to screen at the relevant sector. Finally, a title for the diagram and a reference legend indicating gender were displayed 6.1.
4.2 Adding Interactivity with the Leap Motion
A significant part of the development of this project was implementing interactivity with the Leap Motion controller. Several methods were coded in Processing in order to implement user interactions effectively, including transforming the visualizations based on gender, age group and year, and providing details on demand when requested. The challenging part of this development was designing interactions in terms of NUIs with the Leap Motion, and trying to capture the possibilities 3D gestural control could offer for interactive data visualizations. The design and development of the gestural interactions can be described in terms of unimanual and bimanual interactions.
4.2.1 Interact
As presented in Section 2.4.2, unimanual interactions are used mostly for selection and pointing tasks. Therefore interactions such as hovering over and "touching" buttons were implemented using Pointable or Finger objects. Button, controlRow and ButtonOther classes were coded to create button objects. Various button features were defined by passing arguments to constructor methods, including the button's size, position and shape and, more importantly, boolean variables defining the button's state, i.e. on or off, true or false respectively. In order to turn a button on or off, a Tap gesture was initially implemented. However, through testing the system, the Tap gesture was deemed not reliable and accurate enough to provide such functionality. Therefore a method utilizing the Leap Motion's Z (depth) value was implemented. This method checks whether a Finger object has reached and gone below a certain threshold in order to trigger a "touching" event. Once a "touching" event is recognized, a touch boolean equates to true, changing the button's state to on. The code is as follows:

    if (pointer.z < 0.2) {
      touch = true;
    } else {
      touch = false;
    }

In addition to this, and to ensure more accuracy for button "touches", another method checks whether a Finger is also over a given button. This is implemented by checking the distance from the pointer.x and pointer.y vector components to the x and y position of the button's center on screen:

    boolean overCircle(int xB, int yB, int diameter) {
      float disX = xB - pointer.x;
      float disY = yB - pointer.y;
      if (sqrt(sq(disX) + sq(disY)) < diameter / 2) {
        return true;
      } else {
        return false;
      }
    }

Due to the lack of tactile feedback with gestural control, visual feedback of system status was very important: if a finger hovers over a button, a green ring appears around it, and if the button is "touched" its color changes momentarily. In order to update the line graph visualization to reflect a user's selected age group, a novel interaction for direct manipulation was implemented. This unimanual interaction allows the user to grab a certain age group button with their finger and drag it to the middle of the screen to make a "connection" and update the visualization accordingly 6.1. The user can then reset the age group button to its original starting position by performing a Tap gesture on the far right of the screen. To make use of more than one finger within unimanual selection interactions, I implemented a gender view interaction based on a combination of finger patterns. I wanted to minimize the use of one-finger interactions as much as possible, to combat issues identified such as user fatigue. The gender view interaction also makes use of modal spaces to improve accuracy and the reliable recognition of gestures. A countFF variable is updated within the draw loop to store the number of fingers currently tracked within the Leap's field of view. If pointer.x and pointer.y, i.e. a finger, is within a certain part of the screen, this space triggers a gender view mode. If countFF is equal to 0, i.e. a closed fist with no fingers detected, the line graph updates to show only the male data. If countFF is equal to 1, i.e. one finger detected, the line graph updates to show only the female data, while if countFF is equal to 5, i.e. five fingers detected, the line graph updates to show both the male and female data.
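The finger-count logic can be sketched as follows. This is an illustrative simplification with invented names (GenderView, fromFingerCount); the count values match the fist, one-finger and open-hand poses described above.

```java
public class GenderViewMode {
    enum GenderView { MALE, FEMALE, BOTH, UNCHANGED }

    // Map the number of tracked fingers to a gender view:
    // a closed fist (0 fingers) -> male only, one finger -> female only,
    // an open hand (5 fingers) -> both; other counts leave the view as-is.
    static GenderView fromFingerCount(int countFF) {
        switch (countFF) {
            case 0:  return GenderView.MALE;
            case 1:  return GenderView.FEMALE;
            case 5:  return GenderView.BOTH;
            default: return GenderView.UNCHANGED;
        }
    }
}
```

Treating intermediate counts as "unchanged" guards against transient tracking glitches flipping the view while the hand is moving.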
What is most attractive about gestural control are the possibilities it can offer for bimanual interactions. I implemented one main bimanual interaction within my project, for the line graph visualization. This novel interaction allowed the user to perform tasks on the data such as accessing details on demand, data highlights and comparisons. These data exploration tasks were achieved by developing two movable reference lines: a green line that moves up and down along the y axis of the line graph, and a white line that moves back and forth along the x axis. The lines are made movable by two hands: the green line follows the up-and-down movement of the left hand, while the white line follows the back-and-forth movement of the right hand. At the intersection point of these two lines on the line graph, the relevant data highlight for that year and gender is shown. The pink data highlight is accompanied by a details-on-demand dialogue reflecting the rate value for that particular year. If both gender line graphs are selected and visible, the rate values for both male and female figures are shown. In terms of data exploration and task strategy, this bimanual interaction works efficiently and reliably. Furthermore, the y axis reference line not only enables the data highlight and details-on-demand tasks, but also serves as a tool for comparing rates: with the y axis reference line at a certain rate interval, the rate figures that fall above and below it can be compared for an increase or decrease in mortality rates from a certain year 6.1. Interactions using the Leap Motion controller for the coxcomb diagram were small in comparison to those for the line graph visualization. They included selecting the year with a finger and accessing details on demand by hovering a finger over a certain SEG sector. The status of the hover interactions was encoded using a light pink color: if a user's finger was over a sector, the sector turned light pink. A small white dot served as an interaction aid, letting the user know where their finger was on screen.
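The details-on-demand lookup at the intersection of the two reference lines can be sketched as follows. This is an illustration with invented names; the actual sketch works on the remapped plot coordinates described earlier.

```java
public class ReferenceLines {
    // Given the x position of the vertical reference line, find the index
    // of the nearest year so the matching rate value can be highlighted.
    static int nearestYearIndex(float lineX, float plotX1, float plotX2,
                                int yearCount) {
        float t = (lineX - plotX1) / (plotX2 - plotX1);      // 0..1 across plot
        int index = Math.round(t * (yearCount - 1));
        return Math.max(0, Math.min(yearCount - 1, index));  // clamp to range
    }
}
```

With the index in hand, the rate for the highlighted year is a direct table lookup for whichever gender graphs are visible.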
If the user's finger hovered over a year button and a "touching" event was triggered, a black ring appeared around the button, providing visual feedback of system status.
4.3 Installation Environment
Development thus far has been described in terms of the interactive data visualization aspect of the project. Described here is the development of the installation aspect. The decision to base the interactive data visualization within an installation environment was made to enable walk-up-and-use scenarios and to place the work within a public setting. The main aim of the project is to create more awareness of the issue of suicide in Ireland through the means of an interactive data visualization, and by placing the project in a public setting it was hoped that it would invite curious and random visitors. Kwon (2002) described an important feature of site-specific work as how it "requires the physical presence of the viewer for the works completion". I feel this feature of site-specific work is applicable to the essence of this project, as the interactive data visualization requires not just viewers but active participants to make the project viable. The line graph and coxcomb visualizations are each contained within their own individual screens. Other screens were developed to build a narrative for the installation: an ambient screen, a screen displaying facts on suicide in Ireland, and a menu screen. The narrative structure of the installation is of a "martini glass" style. This style of narrative initially begins with a tight path (the stem of the glass) and then opens up later for free exploration (the body of the glass) (Segel and Heer, 2010). The ambient screen sits idle on display until a visitor starts interacting with it 6.1. The move from the ambient screen to the next screen is enabled through a Swipe gesture. On the next screen, the visitor can pass their finger or hand over points displayed around a circle in order to display a fact about suicide in Ireland. This screen serves two purposes. Firstly, it allows the visitor to get used to what gestural control feels like and to practice unimanual interactions such as hovering, selection and "touching" before moving on to the menu and visualization screens. Secondly, the facts displayed follow the martini glass narrative, introducing the visitor to the issue of suicide in Ireland with some specific, memorable facts 6.1. Moving on from the facts screen, the visitor is presented with a menu screen displaying four buttons for interaction. Three of those buttons allow the visitor to select which gender relates to them and to move to the line graph visualization; the other button enables the visitor to move to the coxcomb visualization. An arm-like interaction aid was developed for the menu screen to help the user see where their finger is on screen, by means of a line and a small dot 6.1. The Swipe gesture is consistent throughout the installation, enabling the user to return to the menu screen at any time. From the menu screen, the narrative opens out into the body of the glass by means of the line graph and coxcomb diagram visualizations. The hardware specification and installation setup were as follows: a 40" screen displayed the installation content, while the Leap Motion controller sat on a table placed in front of the screen. The table was accompanied by a soft cushioned stool, allowing the visitor to sit down and interact with the installation at their own ease and pace.
5 Evaluation

Presented here is an evaluation of the project. The purpose of this evaluation is to identify general problems encountered, which may be addressed in future research, and to evaluate 3D gestural control as a viable means of interactive data visualization. The 3D gestural control and data visualization will be discussed in terms of the research presented in Chapter 2 (Section 2.4). Furthermore, a brief evaluation of the day of the installation is provided. In terms of evaluating the line graph and coxcomb diagram visualizations, I feel their representation and refinement were carried out to a satisfactory standard. Careful consideration was given to perception and color usage, and the encoding of data using pre-attentive variables such as shape, size and position served well in providing a clear and compelling visualization of suicide in Ireland. By providing two visualizations from two different perspectives on deaths by suicide in Ireland, by year 1960-2011 and by SEG 2001-2012, a wide range of data was made available for exploration.
5.1 Problems Encountered
I started this project with a little experience of Java programming, some experience of Processing and a spectator's knowledge of data visualization. The learning curve in regard to all aspects of this project was steep, and so problems occurred as a consequence of my inexperience. Initially, with an interest in learning the C++ programming language, I thought I would develop and implement this project using openFrameworks, an open source framework for creative coding. However, it was soon realized that, due to poor documentation concerning the use of openFrameworks for data visualization, and after a lot of time, effort and a few stressful weeks debugging compatibility issues between openFrameworks, Xcode 5 (the Mac code editor) and external libraries, openFrameworks was unsuitable for my specific data visualization needs. Also factoring into my decision to switch software so many weeks into the project was that the open source software Processing kept coming up in research as the tool to use for data visualization; more specifically, once it was decided to use Ben Fry's Computational Information Design model, Processing was a natural choice, as Fry had created Processing with Casey Reas (Fry, 2004). Concerning the Leap Motion device, problems arose with regard to performance. During the development and testing of the NUI, I soon realized the controller works best in near darkness; otherwise, external factors such as bright lights can interfere with its field of view and affect performance. This problem was easily solved by taking the lights in the environment into consideration and adjusting them accordingly. Fry (2004) commented on how the individual practitioner of interactive data visualization needs to refer to several isolated areas of research in order to inform a design. Through the development of this project, I sometimes found it hard to balance and integrate all these isolated areas of research into a meaningful project. At times I would spend weeks concentrating on the technologies, other weeks concentrating solely on the field of HCI, and other weeks getting bogged down trying to implement code for different data representation solutions.
This proved problematic because, although these areas of research are isolated, there needs to be a constant and balanced integration of them in order to create a meaningful project. The framework for information visualization stated in this report's methodology offered a solution to this problem by providing a solid guiding reference for that constant and balanced integration. Other minor problems encountered included the design of methods and algorithms for implementing design ideas in code. To help break these problems down, much code was first written out as pseudocode in pencil, or concepts were sketched, and then implemented as code in Processing phase by phase 7.
5.2 3D Gestural Control & Data Visualization
There were several challenges identified in this project for the use of 3D gestural control with NUIs for data visualization. Much of the research presented supports the findings of this evaluation and can help point to possible future developments. The evaluation is further supported by observations of visitors using the Leap Motion on the day of the installation. The use of unimanual gestures for direct manipulation tasks such as selection can cause user fatigue quickly. The input resolution and accuracy of selection can also be reduced by the inexperience of a first-time user of the system. Most commonly, users were not aware of the Leap Motion's inverted-pyramid interaction space, and so many used their hands very close to the device, with a significant impact on interaction performance and motion tracking. Several times I had to recommend that users lift their hands further up before interaction could begin. A possible solution could be the introduction of a clear perspex container sitting a few centimeters above the device. From a first-time user's perspective, this would place their hands in the sweet spot of the interaction space, making interaction instantly more enjoyable and effective. Such a container could also be beneficial in providing some tactile feedback to the user when they have "touched" a button. Sometimes, when users pushed their fingers towards the screen to initiate a touch event, the acceleration of the pointed finger could initiate a Tap gesture instead. These observations allowed me to evaluate whether the higher degree of freedom offered by the Leap Motion is a hindrance to interactive data visualization, or simply a new technology still waiting to be explored properly. The higher degree of freedom is good insofar as it reduces the gap between user intent and the technology.
Novel interactions, such as the bimanual one developed here, do deliver reliable, interesting and meaningful interaction using 3D gestural control for data visualization. However, the higher degree of expression meant some visitors did indeed suffer from Immersion Syndrome and frustration. Gestures such as Swipe were wrongly initiated while a user was attempting another action, such as simply taking their hand out of the interaction space for a moment, meaning exploration of the data visualization screens could be abruptly interrupted by a return to the menu screen. Possible solutions include designing gestures to be initiated only if a hand moves in a certain direction, or integrating the gesture with other input modalities such as voice commands. The integration of input modalities could provide a more consistent and reliable interaction, as the advantages of one technology could complement those of another. From this observation, and based on my own experience of designing and developing the NUI, although existing HCI models and the more specific data visualization taxonomies serve as a good starting point for developing interactions, it would serve the visualization community well if design guidelines were rehashed and defined in terms of the different possible input modalities. Currently, as mentioned previously, most interaction design guidelines are for WIMP or desktop-based interactive data visualizations. This can prove problematic, as such guidelines focus on unimanual or mouse-based interactions, while new input modalities such as the 3D gestural controller presented in this project provide possibilities for designing more bimanual-focused interactions. What can mainly be evaluated from the observations is the breakdown of the main aim of the project. Visitors to the installation had commonly never used a 3D gestural controller before, and many were so immersed in using the technology itself, focusing on where their hands were and what they were doing, that some missed the opportunity to actually explore the data.
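One of the suggested fixes, gating the Swipe gesture by direction and speed, can be sketched as follows. The thresholds and names are illustrative assumptions, not the Leap SDK's own gesture API.

```java
public class SwipeGate {
    // Accept a swipe only when the hand moves mostly horizontally and
    // fast enough, so that simply withdrawing the hand (mostly along
    // the z axis, toward the user) does not trigger a return to menu.
    static boolean isMenuSwipe(float vx, float vy, float vz, float minSpeed) {
        float speed = (float) Math.sqrt(vx * vx + vy * vy + vz * vz);
        if (speed < minSpeed) return false;        // too slow: ignore
        return Math.abs(vx) > 2 * Math.abs(vz)     // mostly horizontal...
            && Math.abs(vx) > 2 * Math.abs(vy);    // ...not vertical
    }
}
```

The direction test is what distinguishes an intentional sideways swipe from a hand being pulled out of the interaction space.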
While in the visualization screens, many commented on the aesthetics of the representations but failed to perform many data exploration tasks due to the distraction of the technology. Although there was a Leap help button available in the line graph screen, which users could hover over to see instructions on how to use the Leap Motion for data exploration, many users did not use it. It would be good if users could learn more quickly, as soon as they start interacting with the system, what the Leap Motion is and what kinds of gestures they can use to explore. On the Leap Motion Developer Forum, I put up a video of this project, and some feedback was provided on how to address this issue by designing menus for optimal usability and a smoother user experience 6.1. Those visitors who stayed for longer periods of interaction eventually got a real feel for the interaction space and, having learned the interactions, soon forgot about the technology and started exploring the data visualization in a "natural", consistent way. Similarly, as an expert user of the system from testing and running it on many occasions, I found that by the end of this project I could fly through interactions and use the system competently for data exploration.
5.3 Installation Day
One of the intentions of this project was to have the installation running in a public space, such as the Atrium of the Computer Science building, to encourage random walk-up-and-use scenarios. For hardware reasons, the installation instead had to be run in the Postgraduate Studio live room. Although the postgraduate studio was open on the day for anyone to come in and visit, the room still had to be found, and so this defeated the idea of random walk-up-and-use scenarios. The installation setup itself consisted of what was described in the development section (see Appendix C 6.1).
6 Conclusion

The main aim of this project was to develop an interactive data visualization, placed in a public setting as an installation, in order to create more awareness of the issue of suicide in Ireland. Interactivity was designed and developed by means of the Leap Motion 3D gestural controller. The development of this project presented many challenges, and extensive research revealed complex, isolated areas of research concerning the fields of HCI, data visualization and NUIs. In essence, the main aim of this project was achieved. Two data visualizations were developed which encoded data acquired from the NSRF and CSO in a clear and compelling way, while adhering to guidelines on the use of factors such as color and perception. A narrative was created for the installation in a "martini glass" style to guide the visitor through the installation in a meaningful way. Visitors could cross-correlate their own gender and age group by means of novel unimanual interactions, and through bimanual interactions explore the data at a more detailed level. Many visitors commented on the importance of such projects in increasing awareness, as suicide has touched so many lives. An evaluation of the project was presented in terms of problems encountered and the use of 3D gestural control for data visualization. Reflecting on this evaluation and referencing the research presented, I can arrive at some conclusions. The value of this project can be measured at two distinct levels. Firstly, on a personal level, having approached this project with little or no experience of coding a project of this type, and only a spectator's knowledge of data visualization, I have learned a lot. The development of the project allowed me to significantly improve my coding skills and logical reasoning, and to develop my problem-solving skills. Extensive research has provided me with a wide range of knowledge in areas such as HCI, data visualization, statistics, data analysis and the exciting field of NUIs. The use of the Leap Motion controller was interesting in terms of development, but more so in terms of putting into practice the challenges and possibilities raised in my review of relevant research. Secondly, the value of the project can be measured in terms of its contribution to research. Presently, research into the use of 3D gestural control for interactive data visualization is sparse, and I commonly came across research that only hypothesized possible solutions to the challenges identified. For this project it was of value to be able to apply research in a practical manner and indeed to support research findings with actual examples of use. Lee et al. (2012) comment that the goal of the data visualization community should be to pay special attention to the role and design of interaction for possible new input technologies. Data visualization practitioners should be looking at interaction from a novel and broader angle, exploring the possibilities new technologies could offer beyond desktop-based WIMP interfaces. However, in order to design and develop consistent, reliable and meaningful interactions with these new technologies, a standard vocabulary of design guidelines, or a model (Lee et al., 2012), needs to be created to aid the individual data visualization practitioner in the use of new input modalities. Furthermore, more practical applications of research need to be carried out in order to truly understand how NUIs could benefit data visualization. Exploring the potential of NUIs through practical use can help us better understand the benefits and limitations of such systems.
The exploration of solving issues identified could yield more research questions in terms of use, but also researchers can observe peoples behavior in certain situations in order to truly understand how one might interact with data in a meaningful way.
6.1 Future Developments
For future revisions and developments of this project I would like to explore the possibility of using an interactive tabletop as the display surface rather than a 40" screen. A large interactive tabletop in a public installation setting could invite more social and collaborative explorations of the data, offering another possible way to create awareness of the issue of suicide through public, social situations.

Further exploration of the design of novel interactions for 3D gestural control is needed, both in terms of bimanual use and in terms of what other interactions could be designed to take advantage of the Leap Motion's Z (depth) axis. Rotation-style techniques for data exploration could also be investigated. The introduction of some form of tactile feedback, as mentioned earlier, could lead to interesting results, and testing auditory feedback in place of tactile feedback could be equally interesting. I have also considered that, in future revisions, the Leap Motion controller could work more efficiently for data exploration tasks if integrated with other technologies such as voice commands or touch. Finally, while my research has focused on the development of visual representations of data, it could be of interest to use the same data to identify patterns or explore the data through sound, i.e. data sonification.
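As a rough illustration of how the Leap Motion's depth axis could drive a data exploration gesture, the sketch below maps a palm depth reading to a zoom factor for a chart. This is a minimal sketch, not code from the installation: the `palm_z_mm` input stands in for the Z component of a palm position as the Leap SDK reports it (in millimetres), and the range and zoom constants are assumptions chosen only for illustration.

```python
def depth_to_zoom(palm_z_mm, near=-150.0, far=150.0,
                  min_zoom=1.0, max_zoom=4.0):
    """Map a palm depth reading (mm, assumed interaction range) to a
    chart zoom factor: pushing the hand toward the screen (smaller z)
    zooms in, pulling it back zooms out."""
    # Clamp the reading to the assumed interaction range.
    z = max(near, min(far, palm_z_mm))
    # Normalise so that near -> 1.0 (fully zoomed in) and far -> 0.0.
    t = (far - z) / (far - near)
    return min_zoom + t * (max_zoom - min_zoom)

# Hand pushed fully forward -> maximum zoom; hand at rest depth -> midway.
print(depth_to_zoom(-150.0))  # 4.0
print(depth_to_zoom(0.0))     # 2.5
```

Clamping matters in practice because the controller occasionally reports positions outside the expected tracking volume; without it, a tracking glitch would produce a jarring jump in zoom level.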
References

Arensman, E., Corcoran, P., Williamson, E., Mc Carthy, J., Duggan, A. and Perry, I. (2013), Second report of the suicide support and information system, Technical report, National Suicide Research Foundation, Cork. URL: http://nsrf.ie/wp-content/uploads/reports/SSISReport2013.pdf

Baudel, T., Beaudouin-Lafon, M., Braffort, A. and Teil, D. (1992), An interaction model designed for hand gesture input, Université de Paris-Sud, Centre d'Orsay, Laboratoire de Recherche en Informatique.

Beaudouin-Lafon, M. (2000), Instrumental interaction: an interaction model for designing post-WIMP user interfaces, in 'Proceedings of the SIGCHI Conference on Human Factors in Computing Systems', ACM, pp. 446–453.

Behren, C. (2008), The Form of Facts and Figures: Design Patterns for Interactive Information Visualization, Masters thesis, Potsdam University of Applied Sciences, Potsdam.

Bertin, J. (1983), Semiology of Graphics, University of Wisconsin Press, Madison, Wis.

Bhuiyan, M. and Picking, R. (2009), Gesture-controlled user interfaces, what have we done and what's next, in '5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking', pp. 59–60.

Blake, J. (2010), 'What is the natural user interface?' [online], available: http://nui.joshland.org/2010/03/what-is-natural-user-interface-book.html [accessed: March 2014].

Boussemart, Y., Rioux, F., Rudzicz, F., Wozniewski, M. and Cooperstock, J. R. (2004), A framework for 3D visualisation and manipulation in an immersive space using an untethered bimanual gestural interface, in 'Proceedings of the ACM Symposium on Virtual Reality Software and Technology', VRST '04, ACM, New York, NY, USA, pp. 162–165. URL: http://doi.acm.org/10.1145/1077534.1077566

Card, S. K., Mackinlay, J. D. and Shneiderman, B. (1999), Readings in Information Visualization: Using Vision to Think, Morgan Kaufmann Publishers, San Francisco, Calif.

Corcoran, P. and Arensman, E. (2010), 'A study of the Irish system of recording suicide deaths', Crisis: The Journal of Crisis Intervention and Suicide Prevention 31(4), 174–182.

Feltron, N. (n.d.), 'Report App' [online], available: http://www.reporter-app.com/ [accessed: January 2014].

Ferster, B. (2012), Interactive Visualization: Insight through Inquiry, MIT Press, Cambridge, Mass.

Fry, B. (2008), Visualizing Data: Exploring and Explaining Data with the Processing Environment, O'Reilly Media, Cambridge.

Fry, B. J. (2004), Computational information design, PhD thesis, Massachusetts Institute of Technology. URL: http://dspace.mit.edu/handle/1721.1/26913

Fry, B. J. (n.d.), 'Genetics Projects' [online], available: http://benfry.com/genetics [accessed: September 2013].

Garber, L. (2013), 'Gestural technology: Moving interfaces in a new direction [technology news]', Computer 46(10), 22–25.

Guiard, Y. (1987), 'Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model', Journal of Motor Behavior 19(4), 486–517.

Hinckley, K., Pausch, R., Proffitt, D., Patten, J. and Kassell, N. (1997), Cooperative bimanual action, in 'Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems', CHI '97, ACM, New York, NY, USA, pp. 27–34. URL: http://doi.acm.org/10.1145/258549.258571

Iliinsky, N. and Steele, J. (2011), Designing Data Visualizations: Representing Informational Relationships, O'Reilly Media, Inc.

iMedicalApps (n.d.), 'Leap Motion hands free tech has potential for surgical uses within medicine' [online], available: http://www.imedicalapps.com/2013/07/leap-motion-tech-surgical-medicine/ [accessed: December 2013].

Isenberg, P., Carpendale, S., Hesselmann, T., Isenberg, T. and Lee, B. (2012), Proceedings of the Workshop on Data Exploration for Interactive Surfaces - DEXIS 2011, Rapport de recherche.

Kavanagh, S. (2012), 'Facilitating natural user interfaces through freehand gesture recognition'.

Kronlachner, M. (2013), The Kinect distance sensor as human-machine-interface in audio-visual art projects, Masters thesis, University of Music and Performing Arts Graz. URL: http://www.matthiaskronlachner.com/?p=1623

Kultys, M. (2013), 'Visual alpha-beta-gamma: Rudiments of visual design for data explorers', Parsons Journal for Information Mapping V(1).

Kwon, M. (2002), One Place After Another: Site-Specific Art and Locational Identity, MIT Press, Cambridge.

LeapMotion (2013a), 'Developer API Overview' [online], available: https://developer.leapmotion.com/documentation/java/devguide/Leap_Overview.html [accessed: September 2013].

LeapMotion (2013b), 'Our Device' [online], available: https://www.leapmotion.com/product [accessed: December 2013].

Lee, B., Isenberg, P., Riche, N. and Carpendale, S. (2012), 'Beyond mouse and keyboard: Expanding design considerations for information visualization interactions', IEEE Transactions on Visualization and Computer Graphics 18(12), 2689–2698.

Liu, W. (2010), Natural user interface - next mainstream product user interface, in 'Computer-Aided Industrial Design & Conceptual Design (CAIDCD), 2010 IEEE 11th International Conference on', Vol. 1, pp. 203–205.

Malone, K. (2013), Suicide in Ireland 2003-2008, National Office for Suicide Prevention, 3TS.

Parkes, E. A. (2013), 'Mode of communication of cholera by John Snow, MD: second edition - London, 1855, pp 162', International Journal of Epidemiology 42(6), 1543.

Pecanha, S. (2013), 'Visual Storytelling at the Graphics Department of The New York Times', Parsons Journal for Information Mapping V(3).

Saffer, D. (2008), Designing Gestural Interfaces: Touchscreens and Interactive Devices, O'Reilly Media, Inc.

Samaritans (n.d.), 'Suicide: facts and figures' [online], available: http://www.samaritans.org/about-us/our-research-0/facts-and-figures-about-suicide [accessed: September 2013].

Scarr, S. (n.d.), 'Iraq's bloody toll' [online], available: http://www.scmp.com/infographics/article/1284683/iraqs-bloody-toll [accessed: September 2013].

Scowcroft, E. (2013), Samaritans suicide statistics report 2013: Data for 2009-2011, Technical report, Samaritans.

Segaran, T. and Hammerbacher, J. (2009), Beautiful Data: The Stories Behind Elegant Data Solutions, O'Reilly, Farnham.

Segel, E. and Heer, J. (2010), 'Narrative visualization: Telling stories with data', IEEE Transactions on Visualization and Computer Graphics 16(6), 1139–1148.

Shneiderman, B. (1996), The eyes have it: a task by data type taxonomy for information visualizations, in 'Visual Languages, 1996. Proceedings., IEEE Symposium on', pp. 336–343.

StatsIndiana (2013), 'Calculating a Rate' [online], available: http://www.stats.indiana.edu/vitals/CalculatingARate.pdf [accessed: November 2013].

Transport for London (n.d.), 'Design classics' [online], available: http://www.tfl.gov.uk/corporate/projectsandschemes/2443.aspx [accessed: November 2013].

Tufte, E. R. (1990), Envisioning Information, Graphics Press, Cheshire, Conn.

UnderstandingUncertainty (2008), 'Nightingale's Coxcombs' [online], available: http://understandinguncertainty.org/coxcombs [accessed: March 2014].

Ward, M., Grinstein, G. G. and Keim, D. (2010), Interactive Data Visualization: Foundations, Techniques, and Applications, A K Peters, Natick, Mass.

World Health Organisation (2013), 'Suicide prevention (SUPRE)' [online], available: http://www.who.int/mental_health/prevention/suicide/suicideprevent/en/ [accessed: September 2013].

Yi, J. S., ah Kang, Y., Stasko, J. and Jacko, J. (2007), 'Toward a deeper understanding of the role of interaction in information visualization', IEEE Transactions on Visualization and Computer Graphics 13(6), 1224–1231.

YouTube (2013), 'Leap Motion Structured Light Pattern' [online], available: http://www.youtube.com/watch?v=UI5EBzU_QqM [accessed: December 2013].
Appendix A Developmental Concept Sketches & Screenshots of Developmental Coding
Figure 1: Change to Line Graph
Figure 2: Testing Pink Data Highlights
Figure 3: Circle Packing
Figure 4: Chord Diagram
Figure 5: Coxcomb Diagram Development
Figure 6: Refining the Line Graph
Figure 7: Pseudocode One
Figure 8: Pseudocode Two
Figure 9: Pseudocode Three
Figure 10: Pseudocode Four
Appendix B Screenshots of Developed Data Representation & Interface
Figure 11: Ambient Screen
Figure 12: Suicide Facts Screen
Figure 13: Menu Screen
Figure 14: Line Graph Screen (Male)
Figure 15: Line Graph Screen (Female)
Figure 16: Line Graph Screen, Both Genders, with Reference Lines & Data Highlight
Figure 17: Line Graph Screen with Age Connection
Figure 18: Coxcomb Diagram of Socio-Economic Groups of Deceased, 2001-2012
Appendix C Installation Day Setup & Other
Figure 19: Installation Setup (picture taken with the lights on; the installation itself ran in darkness)
Figure 20: Close-up of Installation
Figure 21: Another Close-up of Installation
Figure 22: Leap Motion Developer Forum Feedback