Adding Structured Data in Unstructured Web Chat Conversation
Min Wu, Arin Bhowmick, Joseph H. Goldberg
Oracle Corporation, 500 Oracle Parkway, Redwood Shores, CA 94065
{min.wu, arin.bhowmick, joe.goldberg}@oracle.com
ABSTRACT
Web chat is becoming the primary customer contact channel in customer relationship management (CRM), and Question/Answer/Lookup (QAL) is the dominant communication pattern in CRM agent-to-customer chat. Text-based web chat for QAL has two main usability problems. Chat transcripts between agents and customers are not tightly integrated into agent-side applications, requiring customer service agents to re-enter customer typed data. Also, sensitive information posted in chat sessions in plain text raises security concerns. The addition of HTML form widgets to web chat not only solves both of these problems but also adds new usability benefits to QAL. Forms can be defined beforehand or, more flexibly, dynamically generated. Two preliminary user studies were conducted to compare these paradigms. An agent-side study showed that adding inline forms to web chat decreased overall QAL completion time by 47 percent and increased QAL accuracy by removing all potential human errors. A customer-side study showed that web chat with inline forms is intuitive to customers.
Question/Answer/Lookup (QAL) is the dominant communication pattern in the agent-to-customer chat in customer relationship management (CRM). In the question phase, an agent asks a customer a question (for example, “May I have your name and your company’s name?”); in the answer phase, the customer replies to the agent (for example, “My name is Leslie Hahn, and my company’s name is Kingos.”); and in the lookup stage, the agent uses the submitted answer in the chat session to look up information (for example, to find the corresponding customer record) or to perform other actions (for example, to create a new customer record if the customer is not already in the database).
Author Keywords
Web Chat; CRM; Human Performance
ACM Classification Keywords
H.5.2 [Graphical user interfaces (GUI)]: User Interfaces
INTRODUCTION
Web-based live chat, a standard communication channel in social networking and collaboration, is becoming the primary customer contact channel for companies, especially in sales and customer service. Fifty-eight percent of US consumers have interacted with an e-retailer using web chat [5]. A well-implemented chat strategy can transform a business's ability to drive sales, increase productivity, achieve operational savings, and deliver an excellent customer experience. Moreover, web chat is a jumping-off point for advanced web collaboration (such as co-browsing and remote control) between agents and customers to further improve support metrics (such as first-contact resolution, average talk time, and incident handling time).
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. UIST'12, October 7-10, 2012, Cambridge, MA, USA. Copyright © 2012 ACM 978-1-xxxx-xxxx-x/xx/xx... $10.00.
Two main usability problems arise from the QAL pattern in web chat. First, because chat transcripts (the text content in the chat session) are unstructured and not integrated into other system applications, using the data typed by a customer requires copy-and-paste or duplicated typing across different user interfaces. For example, when identifying a customer, an agent asks for the customer's full name and company name and has to copy and paste the customer's response from the chat window to the customer profile management interface. Sometimes, the free-form chat response from a customer has to be mentally processed by the agent and then confirmed with the customer. As another example, if a customer types that her company should be available "next Wednesday" for an onsite visit, in order to follow the grammatical rules specified by the agent's company, the agent has to figure out the exact date, confirm it with the customer, and enter it into the company's calendar to create an event. Unstructured chat data that requires re-entry among different interfaces makes web chat inefficient and error-prone. The second usability problem pertains to security and privacy concerns, because chat transcripts are in plain text. Even though the security of chat sessions is guaranteed by HTTPS, concerns arise when customers have to enter sensitive information, such as their PINs or the last four digits of their social security numbers, in plain text in order to verify their identities [6, 12].
In order to solve these two usability problems, we propose the addition of form widgets to web chat conversation, as shown in Figure 1. Instead of asking questions in free-form text, an agent sends a customer a web form as a chat message. The customer then completes the form in the chat window. Clicking the Submit button in the form sends the completed data from the customer to the agent as another chat message. Inline forms structure customer input data so that it can be interpreted and used by system applications automatically. In Figure 1, the system automatically identifies and verifies the customer after form submission. Agents do not need to manually re-enter or copy and paste customer data across different interfaces.
Figure 1: Web Chat with Inline Form for QAL
We make two main contributions with this paper. First, we design and develop web chat with inline forms to facilitate the data transfer between the two parties during QAL. Forms can be created either by selecting predefined forms or, more flexibly, by dynamically generating a form using existing form widgets from other interfaces. Our chat application creates a single coherent view of the chat conversation, supports efficient data input and automatic data processing, and can be extended as a collaboration tool that supports fine-grain joint form filling and control sharing. Second, we conducted two preliminary user studies in the customer service context to show that, compared with text-only communication, chat with inline forms improved QAL efficiency by 47 percent and removed all potential human errors on the agent side. Additionally, we show that customers find chat with inline forms to be an intuitive means of communication when responding to agent questions.
RELATED WORK
Different approaches have been taken to address the two usability problems presented in this paper.
Pre-Chat Survey
Pre-chat surveys [1], with sets of questions presented to customers before chats start, have been used to collect the information necessary to identify customers and their problems. However, pre-chat surveys are known to create a barrier between agents and customers, potentially causing up to 39 percent of customers to abandon the chat at that point [1, 15]. Being asked for personal information before actually using the online service is believed to be the main cause of this high abandonment rate [3]. Second, these surveys cannot be used during chat sessions. Third, the questions covered by pre-chat surveys have to be defined beforehand, so when agents want to ask questions that are not in the pre-chat survey, they have to use text chat.
Sensitive Information Submission
To address security concerns from customers, some websites advise customers not to enter personal or sensitive information into web chat [14]. Instead, customers are asked to use a pre-chat survey or are redirected to a separate SSL-protected web page. Other websites provide a
web chat service only to authenticated customers but not to the general public [11], which limits the usage of web chat. LivePerson uses pattern matching to mask certain types of sensitive information (for example, US Social Security Numbers and credit card numbers) in chat transcript display and storage [8]. However, masking does not work for sensitive information that lacks a specific pattern, such as account IDs and passwords.
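As a concrete illustration, this kind of pattern-based transcript masking can be sketched in a few lines of JavaScript. The regular expressions below are our own illustrative assumptions, not the vendor's actual rules:

```javascript
// Sketch of pattern-based masking for chat transcripts, in the style of the
// LivePerson approach [8]. The patterns are illustrative assumptions.
const MASK_PATTERNS = [
  // US Social Security Number, e.g. 123-45-6789
  /\b\d{3}-\d{2}-\d{4}\b/g,
  // 16-digit card number with optional spaces or dashes between groups
  /\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b/g,
];

function maskTranscript(text) {
  // Replace every matched digit with '*' while keeping separators readable.
  return MASK_PATTERNS.reduce(
    (masked, pattern) =>
      masked.replace(pattern, (match) => match.replace(/\d/g, "*")),
    text
  );
}
```

Note that a free-form secret such as "my password is hunter2" passes through unchanged, which is exactly the limitation described above: masking only works for information with a recognizable pattern.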
Data Integration from Chat
As a text-based communication channel, chat generally has its content opaque to the underlying computer system. Helping machines understand chat content would facilitate data integration from chat into other applications, and thus enable software or web automation triggered from the chat channel.
Natural language processing (NLP) can help systems understand live chat content, but it is currently used only for certain data types in certain tasks, such as an NLP tool that schedules meetings during a chat session by determining meeting participants and available times from unformatted chat content [19]. A more generic approach is needed to communicate structured data among chatters. Semi-structured messages [9] used different message templates to create collaboration objects such as calendar events and group tasks; this proposal concentrated on the back-end automation of structured messages. Collaborative data objects (CDOs) can be added to chat sessions to enhance collaboration [17]. CDOs are created, viewed, and edited by launching a separate form-based dialog window from the chat window, which fragments the chat conversation and requires users to switch windows during chat. More seriously, the data communication in the separate dialog window is lost from the chat transcript, leaving an incomplete chat conversation. Also, the supported types of CDOs have to be defined beforehand, causing the same issue as the pre-chat survey. PlayByPlay (PBP) is a web collaboration tool that provides two mechanisms for data communication during chat [16]. First, User A can select a portion of a web page and send it as a clip to User B through the browser's PBP sidebar. The clip appears as an image in the chat transcript, and an interactive HTML version of the clip can be opened in a new browser tab. Second, a question-answer interface generates and sends a question to User B when User A clicks a question mark next to a form input field. Whatever User B types then automatically appears in that input field. Like the CDO, the PBP clip feature cannot record the clip interaction in the chat transcript. Moreover, the clip feature is mainly used for User B to observe, correct, and confirm web activity (such as filling in a form) from User A; control sharing, where User B can trigger an action from the clip (for example, by clicking a button) on either user's behalf, is not supported. As for the question-answer interface, since it transforms input fields into text messages, User B must type the answer and cannot take advantage of existing form widgets to improve input efficiency. Moreover, when in answer mode, User B can send only the answer to the prior question. If User B needs clarification of the question that she is about to answer, she must cancel answer mode, return to regular chatting mode, and wait for User A to resend the question.
WEB CHAT WITH INLINE FORMS
Web forms can structure the chat data transmitted in the QAL pattern, making it directly available to other applications.
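In contrast to these approaches, our design treats a form forward and its submission as structured chat messages. A minimal sketch in plain JavaScript objects (the field names and message shapes are illustrative, not the prototype's actual wire format):

```javascript
// Illustrative message shapes for the QAL flow. Timestamps and persistence
// are omitted; field names are assumptions, not the prototype's actual format.
function makeFormMessage(sender, fields) {
  // Question phase: the agent forwards a form definition as a chat message.
  return { type: "form", sender, fields };
}

function makeSubmissionMessage(sender, formMessage, values) {
  // Answer phase: the submission posts read-only name-value pairs into the
  // transcript. Sensitive values are masked for display, while the raw values
  // remain available to the automated lookup phase on the back end.
  const pairs = formMessage.fields.map((f) => ({
    name: f.name,
    value: f.sensitive ? "****" : values[f.name],
  }));
  return { type: "submission", sender, pairs, rawValues: values };
}

// Example: the identification/verification exchange from Figure 1.
const question = makeFormMessage("agent", [
  { name: "fullName", label: "Full name" },
  { name: "pin", label: "Customer PIN", sensitive: true },
]);
const answer = makeSubmissionMessage("customer", question, {
  fullName: "Leslie Hahn",
  pin: "1234",
});
```

Because the submission carries named values rather than free text, the lookup phase can consume it directly, with no agent parsing or re-entry.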
Form as Chat Messages
In the question phase, instead of typing a question in plain text, an agent forwards a form to a customer. The form is posted as a chat message into the chat transcript, ready for the customer to complete. In the answer phase, the customer completes the form that appears in the chat transcript and clicks the Submit button. The submission generates a new chat message containing a list of read-only name-value pairs with the customer's input data. Because the form data is structured, the lookup phase can be automated. At the end of the lookup phase, the lookup results, as well as any necessary action buttons (for example, a Create Customer button if the current customer cannot be identified), are appended to the form submission message in the agent's view of the chat transcript.
Treating form forwarding and form submission as separate chat messages guarantees a single complete view of the chat session for both the agent and the customer. First, form interaction during the chat is no longer lost from the conversation archive. Second, because the form submission appears in the chat transcript instead of the chat input area, the customer can type and send other text messages (such as a clarification request) while the form is waiting to be completed. Third, form re-submission is straightforward: the customer can generate a new chat message by entering different data and re-submitting the form.
Usability Enhancement in CRM Chat
In addition to automating the lookup phase without requiring agents to manually "move" data between the chat window and other applications, web forms in chat enhance the user experience of the QAL pattern in CRM chat by exploiting existing form features.
First, web forms can mask sensitive data submission. Therefore, the same sense of security gained from a web form submission can be achieved during a chat session. The system process that uses the customer's sensitive data can be launched automatically without revealing the data to the agent, as shown in the verification flow in Figure 1.
Second, existing form widgets, such as a date/time picker, choice list, or input with auto-complete, can be leveraged to improve input efficiency on the customer side. For example, instead of typing in the long model number of her problematic printer, a customer can choose the model from a choice list populated with the model numbers of all the printers that she has bought. Moreover, customers need to type only the required values (for example, "John Smith") without the extra typing to describe what the values are about (for example, "my name is John Smith"). Such input efficiency improvement can be significant for customers who are chatting from mobile devices, because typing on mobile devices is slow and unwieldy. As shown in Figure 2, picking a value from a choice list (right) is much easier and faster than typing in text (left). Auto-complete can further improve efficiency when the choice list is long. Additionally, by applying the new input types from HTML5 in the form, the soft keyboard on a mobile device can be automatically adjusted based on the focused input field.
Figure 2: Input Efficiency Improvement in Smartphone
Third, data validation supported by forms can automatically detect certain types of input errors from customers and thus prevent invalid data from being submitted, which is impossible with text-based chat. Form-based data validation not only reduces network traffic, but also relieves agents from manually validating customer data.
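The widget, keyboard, and validation points above can be sketched as a small renderer that maps a field definition to an HTML5 input; the field shape and helper name are illustrative assumptions, not the prototype's actual implementation:

```javascript
// Sketch: render a field definition to an HTML5 input, so the browser supplies
// the right widget, soft keyboard, and client-side validation.
function renderField(field) {
  if (field.options) {
    // Choice list instead of free typing (e.g. the customer's printer models).
    const opts = field.options.map((o) => `<option>${o}</option>`).join("");
    return `<label>${field.label} <select name="${field.name}">${opts}</select></label>`;
  }
  const type = field.sensitive
    ? "password"          // masks sensitive input in the chat window
    : field.type || "text"; // e.g. "date", "tel", "email" adjust the soft keyboard
  const pattern = field.pattern ? ` pattern="${field.pattern}"` : "";
  const required = field.required ? " required" : "";
  return `<label>${field.label} <input type="${type}" name="${field.name}"${pattern}${required}></label>`;
}
```

With `pattern` and `required` attributes, the browser rejects invalid data before submission, which is the validation benefit described above.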
Form Creation
We propose two mechanisms to create forms and insert them into chat.
Predefined Forms
CRM agents can use predefined forms: a list of frequently asked questions is compiled, and a form is designed for each question beforehand. At the front end, appropriate form widgets are chosen with proper input masking (for sensitive data) and validation, while at the back end, the automatic system flow is defined to consume the customer's form submission.
During the chat session, these predefined forms can be inserted into the chat input area, either by clicking an Insert button or, more efficiently, by typing a special prompt in the chat input area (Figure 3). Moreover, when agents see a predefined form in the agent's console that needs answers from the customer, they can click the associated button to push the form into the chat input area without selecting it from the predefined form list.
The list of predefined forms to insert can be dynamic and context sensitive. For example, before a customer is identified, all the customer-specific forms can be hidden. This list can thus serve as handy guidance for agents to follow when they solve customer problems.
Figure 3: Form Creation Using Predefined Form
When a form is inserted, agents can tweak it before forwarding it, such as by typing additional help text above or below the form if they determine that the form is not quite self-explanatory. If agents change their mind, they can close the form to undo the insertion. Eventually, pressing the ENTER key posts the form into the chat transcript.
Dynamic Form Generation
Inserting predefined forms is limited because all the forms have to be designed beforehand. We propose a more flexible approach to overcome this limitation, in which agents push input fields, or sets of input fields, from other applications in the agent's console into the chat input area. A new form is dynamically composed from the pushed input fields.
Chat-aware inputs are the input fields in the agent's console that can be pushed into chat. As shown in Figure 4, each chat-aware input field is marked with a special icon. Clicking the icon pushes the input field, along with its label, into the chat input area. The pushed input field is then highlighted, indicating that the chat is expected to supply its value. Agents can edit the label of a pushed input field if the label is missing or might not be understood by the customer. Agents can type help text before or after each pushed input field, and they can undo the push by using the close icon associated with it.
When agents are satisfied with the dynamically generated form in the chat input area, they press the ENTER key to post the form as a chat message into the chat transcript. After the customer completes the form and submits the data, the submitted values are automatically populated into the corresponding input fields of the original form in the agent's console. No copy and paste or re-entry is necessary. Agents need only click the action button on the agent's console to continue.
In the CRM environment, dynamic form generation can be combined with predefined forms: repeatedly pushing certain input fields into chat could suggest a new predefined form.
Because only certain form fields or control widgets are pushed into a chat session, the other chat party may not get the larger context needed to make an informed decision about providing certain information or performing certain actions. The ATG PagePeek approach [11] could provide this context by displaying a read-only version of the web page from which the form fields or controls are pushed.
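The push-and-populate cycle of dynamic form generation can be sketched as two small helpers; all names here are illustrative assumptions rather than the prototype's actual API:

```javascript
// Sketch of dynamic form generation: chat-aware console fields are "pushed"
// into the chat input area and composed into a new form; on submission the
// values flow back to the originating console fields.
function pushField(chatDraft, consoleField, labelOverride) {
  // Agents may relabel a field the customer would not understand.
  chatDraft.fields.push({
    sourceId: consoleField.id, // ties the answer back to the console field
    name: consoleField.name,
    label: labelOverride || consoleField.label,
  });
  return chatDraft;
}

function applySubmission(consoleFields, draft, submittedValues) {
  // Populate the original console fields from the customer's submission,
  // so no copy-and-paste or re-entry is needed.
  for (const f of draft.fields) {
    const target = consoleFields.find((c) => c.id === f.sourceId);
    if (target) target.value = submittedValues[f.name];
  }
  return consoleFields;
}

// Example: push a service request number field and apply the submission.
const consoleFields = [{ id: "sr1", name: "srNumber", label: "SR #", value: "" }];
const draft = pushField({ fields: [] }, consoleFields[0], "Service request number");
applySubmission(consoleFields, draft, { srNumber: "SR-4711" });
```

The `sourceId` link is the design point: it lets the submitted value land in the exact console field it came from, so the agent only has to click the action button to continue.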
Figure 4: Dynamic Form Generation
Web Chat with Inline Forms as Collaboration Tool
Dynamic form generation is not only a significant improvement over CDO-enabled chat [17]; it also extends our solution into a collaboration tool for fine-grain joint form filling and control sharing.
Joint form filling is a CRM web collaboration tool [4]. It enables both customers and agents to view and access the same forms. Joint form filling frequently requires co-browsing, wherein customers share their screens with agents or grant agents remote access to their computers. Our chat-with-inline-form design in the customer service context takes the reverse approach to joint form filling: customers help agents complete a form on the agent side without co-browsing.
Similar to PBP, our chat application can be placed in a browser sidebar. With the inline form enhancement, any form field from any website is chat-aware and can be pushed into the chat sidebar, so that web forms can be filled collaboratively. In addition to form fields, other controls or widgets available in web interfaces can also be pushed into chat. Chat therefore becomes a novel lightweight control-sharing tool that improves end-user productivity. Consider the following scenario. User A wants to open a document from the enterprise document sharing application, but the document is locked by User B. User A could ask User B to stop his current work, open the same application, navigate to the same section that she is already in, and unlock the specific document that User A wants to open. However, with fine-grained control sharing, User A has a more efficient option: pushing the document entry and the Unlock button from the interface into the chat. User B can then click the Unlock button in place in the chat session to unlock the document. Chat, in this example, provides a shortcut to the exact control point for users to perform certain actions with their appropriate privileges. This fine-grained control sharing is more efficient and less error-prone for users of mobile devices: instead of navigating through multiple pages to find and launch the appropriate action, they can leverage the in-context shortcut to perform the same action.
USER STUDY I: AGENT SIDE
In order to determine the degree to which chat with inline forms can improve the efficiency and accuracy of QAL on the agent side, we ran a controlled experiment.
Methodology
Tested Interfaces
Three chat interfaces were tested. Users either typed text (Text) or inserted one of five predefined questions in the format of either plain text (Question) or an inline form (Form) into the chat input area. Then they pressed the ENTER key to post the text or the form into the chat transcript. The five predefined questions were specific to the five study tasks.
Tasks
Each 30-minute study session included five typical agent tasks focused on the QAL pattern, in which an agent asked some questions (Q) and used the submitted data to conduct lookup (L) either manually (using Text or Question) or automatically (using Form). The task selection was based on a year-long set of service and support department observational studies at more than 10 companies and a survey of 90 global call center agents. The tasks were:
Identification (ID): An agent asked a customer her name and company name (Q) and then searched for the customer (L).
Verification (VE): After the customer was identified, an agent asked for the customer's account ID and PIN as security questions (Q), and then matched the answers against the system record (L).
Checking a service request status (SR): An agent asked for the service request numbers and the associated asset numbers (Q), and then searched for the service request (L).
Finding a solution for a technical problem (SL): An agent asked for the machine model and the problem details (Q), and then searched for a solution (L).
Scheduling an onsite visit (OV): An agent asked a customer for a convenient date for the onsite visit (Q), and then used the customer's suggested date to schedule the visit in the system calendar (L).
Study Implementation
A web chat prototype was built using jQuery, PHP, and MySQL. In order to insert forms in the chat input area, we customized CKEditor (http://ckeditor.com/), a rich text editor that can be used inside web pages. The chat messages, with their senders and post timestamps, were stored in a MySQL database on the server. The browser pulled new messages from the server every second and displayed them in the chat transcript area.
The agent console contained a chat interface and a work area where the agent could search customers, service requests, and solutions, and schedule onsite visits. The customer side had only a chat interface in a browser.
Participants
Eighteen professional colleagues (nine males and nine females) with online chat experience participated in the study. Each had previously performed tasks similar to those of customer service agents, interacting synchronously with customers.
Study Procedure
We hypothesized that the Form user interface (UI) would be significantly faster than both the Question UI and the Text UI across all five QAL tasks, and thus we measured the completion time of each task. Participants were assigned the personas of customer service agents from Printana Inc., a fictitious printer company. They were asked to strictly follow a printed agent manual that clearly described the steps and communication rules for each task. The experimenter acted as the customer, chatting in real time from a separate computer and using a script to insert appropriate messages into the chat with minimal latency. The customer always posted the same chat messages within each task script:
ID: The customer provided the correct identification information (name and company name) the first time.
VE/SR/SL: On the first response, the customer provided either an incorrect account ID, service request number, or printer model number, which led to verification failure or no matched result. When asked again, the customer provided the correct information.
OV: The customer's first suggested date was unavailable. When the agent asked for another date, the alternative date was available. When the customer input the suggested dates in plain text, a relative date was used (that is, "Let's say next Friday."). As one of the communication rules, the agent had to translate the relative date to the exact date and confirm it in the month-day-year format (that is, "Do you mean September 23, 2011?").
Task completion time was the difference between the task start time, when the experimenter clicked the Start button, and the end time, when the participant typed or posted an ending message into the chat transcript. Ending messages included "Thank you" after the customers were verified (ID and VE tasks), or the request status, solution, or schedule confirmation (SR, SL, and OV tasks).
Experimental Design
A within-participant design was used. The three chat interfaces were exposed to participants in one of six possible orders (Text-Question-Form, Text-Form-Question, and so on). The three specific tasks (SR, SL, and OV) were presented to participants in one of three counterbalanced orders (SR-SL-OV, OV-SR-SL, SL-OV-SR). A total of 18 experimental conditions (six chat interface orders × three task orders) were presented across the participants. In each session, both ID and VE were done three times, once with each chat interface (Text, Question, and Form), but SR, SL, and OV were done once each, with a different chat interface.
Results
Efficiency
Task completion times were analyzed statistically to determine the impact of each chat interface on each task. A total of 162 observations were made in this study. Fifty-four observations each came from ID and VE, because each of the 18 participants did these two tasks with all three chat interfaces. Eighteen observations each came from SR, SL, and OV, because each participant did these three tasks only once. Three invalid observations were removed because, in three verification tasks, participants passed the verification step despite invalid security answers from the customer. As a result, 159 valid observations were available for statistical analysis.
A two-factor ANOVA on completion times considered both chat interface and task type as main effects. Significant differences among factor levels were explored using Tukey's pairwise comparisons (α = .05). The chat interface effect was significant (F(2, 144) = 43, p < 0.001), with the Form UI resulting in the shortest completion time (mean = 57 seconds), which was significantly faster than both the Question UI (88 seconds) and the Text UI (108 seconds). Figure 5 shows the mean and the standard deviation of the completion time of each task with each chat interface. The Form UI reduced the agent QAL time by 47 percent compared with the Text UI, and by 35 percent compared with the Question UI.
Figure 5: Task Completion Time with Different Chat UIs
Accuracy
Three verification failures and two input errors were made using the Text or Question UI, and date confusion was also observed; no errors were made with the Form UI. The three verification failures occurred when customer answers were incorrect (one with the Text UI and two with the Question UI) and the participants did not confirm that the customer's security answers matched those in the system. The two data entry errors (with the Text and Question UIs) were re-entering the customer name with a typo, and copying the wrong service request number from the chat transcript to the search interface. Four of the 12 participants using the Text or Question UI were confused when scheduling the onsite visit date from a relative date input: they interpreted "next Friday" to mean the Friday of the current week, which required additional confirmation time with the customer.
Discussion
Efficiency
The Form UI was the most efficient across the tasks because the submitted form data was already structured. The identification and verification processes were automatically loaded with customer data without the agent's involvement (ID and VE), and the agent was automatically notified of successful customer identification and verification. Customer data was populated from the form into the agent's work area (SR, SL, and OV), and clicking an action button immediately searched for a service request or solution, or scheduled an onsite visit.
Extensive research on improving call center efficiency has been conducted [2, 13, 18]. A one-minute reduction per call is significant for call centers with at least 1 million calls each year [2]. An efficiency gain of 47 percent for chat agents in QAL is thus very significant in CRM.
Accuracy
The Form UI was error-free mainly because the lookup phase was automated. As a result, verification did not require agent involvement, the customer's input was either automatically used by the system or directly populated into the agent's search interface, and the date picker used for date entry did not require any additional date translation. Customer service chat agents make errors due to intense workloads and working multiple chat sessions in parallel [7]. This agent-side study showed that chat with inline forms minimizes the chances for such errors, which should dramatically improve agent performance.
USER STUDY II: CUSTOMER SIDE
Unlike customer service agents who are oriented to maximizing efficiency, customers using web chat to obtain
help are more concerned with learnability and satisfaction. A formative customer-side study was conducted to determine whether customers could understand and correctly use an inline form posted within a chat transcript by agents.
Methodology
Two chat interfaces were tested: one where customers could use inline forms in the chat transcript (Form), and one without this support (Text). Participants, given the persona of an IT support specialist trying to solve a printer problem, initiated two chat requests to a customer service agent at Printana Inc. One request was to find a solution to a technical problem, and the other was to schedule an onsite visit (as in the SL and OV tasks in Study I). Participants first had to pass the identification and verification steps by providing their account IDs and customer PINs. The experimenter acted as the agent, chatting with the participants from a separate computer in real time. Eight professional colleagues (five males and three females) with online chat experience participated in the study. The study design included four experimental conditions (two chat interface orders × two chat request orders), and two participants were assigned to each condition in a counterbalanced way. Each session lasted 30 minutes.
Results and Discussion
All participants found the Form UI to be intuitive; requested data was submitted to the agent without difficulty. Five participants preferred the Form UI. Four were more comfortable providing sensitive information because the secure form masked the account ID and the customer PIN, whereas the Text UI exposed the sensitive information in plain text. Four felt the inline form provided more structured and well-defined guidance; for example, they did not need to type a label for the provided data (such as "First name is ..."). Two participants felt the date picker was easy to use and wanted the same auto-suggestion support for all input fields whose values were hard to type. For example, after a customer is identified, the printer model input field should have a drop-down list with all the models that the customer has purchased. However, two participants preferred text chat over chat with inline forms. They thought that forms in chat were too impersonal and made them think that they were talking to a "chat bot." Conversely, another participant mentioned that he did not care about impersonal chat as long as his issue was resolved. Sometimes, customers provided required information before the form was forwarded to them, causing agents to re-enter the data from the chat transcript into the search interface. The inline form should therefore be an add-on feature to text chat: if an agent notices that certain customers do not like the inline form, or that it is inappropriate, the agent does not have to use it and can stick with text chat. The customer-side user
study has shown that chat with inline forms is intuitive to customers and that they can immediately use the forms without any difficulty or training. CONCLUSION AND FUTURE WORK
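To make the inline-form mechanism discussed above concrete, the following sketch shows how a secure form definition, transcript masking, and the direct hand-off of structured data to the QAL lookup phase could fit together. This is not the paper's implementation; every type, field name, and function here is an illustrative assumption.

```typescript
// Illustrative sketch only: all types, field names, and functions are assumed.
type FieldType = "text" | "password" | "date" | "select";

interface FormField {
  name: string;
  label: string;
  type: FieldType;     // "password" fields are masked in the transcript
  options?: string[];  // e.g. purchased printer models for auto-suggestion
}

interface InlineForm {
  title: string;
  fields: FormField[];
}

// An identification form like the one in the study:
// account ID and PIN are secure (masked) fields.
const identificationForm: InlineForm = {
  title: "Customer Identification",
  fields: [
    { name: "accountId", label: "Account ID", type: "password" },
    { name: "customerPin", label: "Customer PIN", type: "password" },
  ],
};

// Render a submitted form into the visible chat transcript, masking any
// password-typed field so sensitive values never appear in plain text.
function renderSubmission(
  form: InlineForm,
  values: Record<string, string>
): string {
  return form.fields
    .map((f) => {
      const v = values[f.name] ?? "";
      return `${f.label}: ${f.type === "password" ? "*".repeat(v.length) : v}`;
    })
    .join("\n");
}

interface CustomerRecord {
  accountId: string;
  name: string;
}

// Because the submission arrives as structured data, it can feed the QAL
// lookup phase directly, with no re-entry from the transcript by the agent.
function lookupCustomer(
  db: CustomerRecord[],
  values: Record<string, string>
): CustomerRecord | undefined {
  return db.find((r) => r.accountId === values["accountId"]);
}
```

In this sketch, the transcript shows only masked values while the underlying application receives the real ones, capturing both usability benefits observed in the studies.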
We propose embedding inline forms into web chat so that structured form data can be automatically interpreted and used by the system. Powered by inline forms, web chat is no longer a separate, standalone tool; it becomes a tightly integrated input channel that interfaces directly with the underlying applications. Data in the chat transcript does not have to be re-entered or copied and pasted into other interfaces. Moreover, chat with inline forms provides a sense of security for customers who need to share sensitive information. We showed that chat with inline forms can improve the web chat experience for both agents and customers in CRM. The agent-side study showed that chat with inline forms dramatically improves agent efficiency and accuracy, while the customer-side study showed that chat with inline forms is intuitive and can be used without any difficulty or training. The next phase of the agent-side study will have customer service agents perform the tasks in their real work environments and locations. Chat with inline forms can also be extended from the CRM agent-customer environment to other collaboration environments, supporting joint form filling and control sharing. Future work will implement and evaluate a browser-based chat interface as a browser sidebar, as well as assess its security, trust, and cultural implications.

ACKNOWLEDGEMENTS
This research was funded by the Applications User Experience organization at Oracle. The authors would like to thank Sam Ting, Rami Musa, and Allison Farrell for their interaction and visual design contributions. The opinions expressed are solely the authors' and do not necessarily reflect those of Oracle.

REFERENCES
1. Bold Software. Live Chat Performance Benchmarks. October 2009. http://www.boldchat.com/live_chat_software/files/Live_Chat_Performance_Benchmarks_Oct_2009_I.pdf.
2. Bond, C. Time Is Money: Impacts of Usability at a Call Center. Ergonomics in Design: The Quarterly of Human Factors Applications Vol. 15, No. 4: 17-22. October 2007.
3. Chellappa, R. and Sin, R. Personalization Versus Privacy: An Empirical Examination of the Online Consumer's Dilemma. Information Technology and Management Vol. 6, No. 2-3: 181-202. 2005.
4. eGain. Chat and Cobrowse. eGain Best Practice Series. 2011.
5. Enright, A. Live chat use is on the rise, survey says. May 10, 2011. http://www.internetretailer.com/2011/05/10/live-chat-use-rise-survey-says.
6. Flavián, C. and Guinalíu, M. Consumer trust, perceived security and privacy policy: Three basic elements of loyalty to a web site. Industrial Management & Data Systems Vol. 106, No. 5: 601-620. 2006.
7. Gliedman, C. The ROI of Interactive Chat. Forrester Research. February 2008. http://www.moxiesoft.com/uploadedFiles/tal_resources/ForrestWP_ROI_of_chat.pdf.
8. Live Person. Security Model Overview. January 2011. http://www.liveperson.com/sites/default/files/pdfs/LivePerson%20Security%20White%20Paper.pdf.
9. Malone, T., Grant, K., Lai, K.-Y., Rao, R., and Rosenblitt, D. Semistructured Messages Are Surprisingly Useful for Computer-Supported Coordination. Proceedings of ACM Conference on Computer-Supported Cooperative Work, 1986: 102-114.
10. Oracle | ATG. Live Help on Demand – Agent Console User Manual. July 2011.
11. Provide Support LLC. The Benefits of Live Chat for E-Commerce. http://www.providesupport.com/aboutus/articles/benefits-live-chat-ecommerce.html.
12. Ray, S., Ow, T., and Kim, S. Security Assurance: How Online Service Providers Can Influence Security Control Perceptions and Gain Trust. Decision Sciences Vol. 42, No. 2: 391-412. May 2011.
13. Sapounakis, E. Efficiency: The second essential of a customer centric business. UX Magazine. August 2011. http://uxmag.com/articles/efficiency.
14. Security4web. Instant Messaging and Chat Rooms: Do's and Don'ts. http://www.security4web.org/page.php?id=25.
15. Tabibian, L. Understanding Missed Chat Opportunities and How to Reduce Them. May 24, 2010. http://community.liveperson.com/docs/DOC-1242.
16. Wiltse, H. and Nichols, J. PlayByPlay: Collaborative Web Browsing for Desktop and Mobile Devices. Proceedings of the 27th International Conference on Human Factors in Computing Systems, 2009: 1781-1790.
17. Winkowski, D. and Krutsch, M. Collaborative Data Objects Enhanced Chat in Support of Net-Centric Collaboration. Collaborative Technologies for Network-Centric Operations, 13th ICCRTS: C2 for Complex Endeavors. June 2008.
18. Zapata, C. Evaluating UX by Measuring Task Efficiency. UX Magazine. June 2011. http://uxmag.com/articles/evaluating-ux-by-measuring-task-efficiency.
19. Zurko, M. E. Instant Messaging Auto-Scheduling. U.S. Patent 7,853,471 B2. December 2010.