HET-NETs’06 Tutorial T13 Ilkley, UK, Sept. 2006

Quality of Experience and Quality Feedback

Markus Fiedler
Blekinge Institute of Technology, School of Engineering
Dept. of Telecommunication Systems, Karlskrona, Sweden


My Own Background (1)

Moved from the network towards the user ☺
• Working with Grade of Service/Quality of Service issues since 1992
  – Admission control, dimensioning
• Got interested in end-user throughput perception in 2000
  – “Kilroy” indicator (2002), co-developed with Kurt Tutschku, University of Würzburg
• E-Government project 2002–2004
  – Implications of IT problems
• Preparation of the NoE EuroNGI, 2003


EuroNGI-Related Activities

• Leader of
  – Joint Research Activity JRA.6 “Socio-Economic Aspects of Next Generation Internet”
  – Work Package WP.JRA.6.1 “Quality of Service from the users’ perspective and feedback mechanisms for quality control”
  – Work Package WP.JRA.6.3 “Creation of trust by advanced security concepts”
• EuroNGI-sponsored AutoMon project (2005)
  – Improved discovery of end-to-end problems
  – Improved quality feedback facilities


My Own Background (2)

• Projects within Intelligent Transport Systems and Services since 2003
  – Timely delivery is crucial (dependability, safety)
  – Network Selection Box (GPRS/UMTS/WLAN)
  – How to match technical parameters and user perception?
• Surprised that rather little attention has been paid to user-related issues by “our” scientific community


Thesis 1: Users do have – sometimes unconscious – expectations regarding ICT performance


Quality Problems?!?



Perception of Response Times

Response time | Perception
≤ 100 ms      | System reacts promptly (can even feel “boring”)
~ 1 s         | There is a noticeable delay, but the flow of thoughts is maintained
≥ 10 s        | Flow of thoughts interrupted; the service becomes uninteresting
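To make the bands concrete, a minimal sketch (not from the tutorial; the band edges simply follow the table above) that maps a measured response time to a perception category:

```python
def perceived_quality(response_time_s: float) -> str:
    """Map a measured response time to the perception bands above."""
    if response_time_s <= 0.1:
        return "reacts promptly"              # feels instantaneous
    if response_time_s <= 1.0:
        return "noticeable delay"             # flow of thoughts maintained
    if response_time_s <= 10.0:
        return "flow of thoughts interrupted"
    return "uninteresting"                    # user likely gives up

print(perceived_quality(0.3))  # noticeable delay
```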

• Most users do not care about “technical” parameters such as Round Trip Time (RTT), one-way delay, losses, throughput variations, ...


Some User Reactions (1)

• Study by HP (2000) [1]
• Test customers were exposed to varying latencies while configuring a computer in a web shop and had to rate the service quality
• Some of their comments:
  – “Understanding that there’s a lot of people coming together on the process makes us more tolerant.”
  – “This is the way the consumer sees the company... it should look good, it should be fast.”


Some User Reactions (2)

• “If it’s slow I won’t give my credit card number.”
• “As long as you see things coming up it’s not nearly as bad as just sitting there waiting, and again you don’t know whether you’re stuck.”
• “I think it’s great... saying we are unusually busy, there may be some delays, you might want to visit later. You’ve told me now. If I decide to go ahead, that’s my choice.”
• “You get a bit spoiled. I guess once you’re used to the quickness, then you want it all the time.”


Consequences?

Shortcomings in perceived dependability are likely to cause churn!

[2] summarises:
• 82% of customer defections are due to frustration over the product or service and the inability of the provider/operator to deal with this effectively
• On average, one frustrated customer will tell 13 other people about their bad experiences
• For every person who calls with a problem, there are 29 others who will never call
• About 90% of customers will not complain before defecting – they will simply leave once they become dissatisfied


Quality of Service (QoS)

• Telecom view
  – ITU-T E.800 (1994) defines QoS as “the collective effect of service performance which determine the degree of satisfaction of a user of the service”, including
    • Service support performance
    • Service operability performance
    • Serveability (service accessibility/retainability/integrity performance)
    • Service security performance
  – QoS measures are only quantifiable at a service access point


Quality of Service (QoS)

• Internet view
  – Property of the network and its components
    • “Switch A has Quality of Service”
  – Some kind of “better-than-best-effort” packet forwarding/routing
    • RSVP
    • IntServ
    • DiffServ
• Performance researcher view
  – Results from queuing analysis


Quality of Experience (QoE) [2, 3]

• Rather new concept, even more user-oriented than QoS: “how a user perceives the usability of a service when in use – how satisfied he or she is with a service” [2]
• Includes
  – End-to-end network QoS
  – Factors such as network coverage, service offers, level of support, etc.
  – Subjective factors such as user expectations, requirements, particular experience
• Economic background: a disappointed user may leave and take others with him/her


Quality of Experience (QoE)

• Key Performance Indicators (KPIs)
  – Reliability (service quality of accessibility and retainability)
    • Service availability
    • Service accessibility
    • Service access time
    • Continuity of service
  – Comfort (service quality of integrity KPIs)
    • Quality of session
    • Ease of use
    • Level of support
• Need to be measured as realistically as possible


Thesis 2: There is a need for more explicit feedback to make the user feel more confident


Typical Feedback

[Figure: examples of typical feedback in today’s applications.]

Cf. [4], Section 2.4


Types of Feedback

• Explicit feedback
  – Positive/negative acknowledgements
    • E.g. TCP
  – Asynchronous notifications
    • E.g. SNMP traps
• Implicit feedback
  – Can be obtained by observing whether/how a process is happening
  – Dominates the Internet as of today


1. Feedback from the Network

a. Network → Application
   • Implicit: no or late packet delivery
b. Network → Network Provider
   • Classical network management/monitoring
c. Network → User
   • Implicit: “Nothing happens...”
   • Rudimentary tools available
   • Operating system issues warnings

Within the network stack: control packets


2. Feedback from the Application

a. Application → Application
   • Some applications measure the performance of the packet transfer and adapt themselves (e.g. Skype, videoconferencing); see the sketch below
b. Application → User
   • Implicit: by not working as supposed
   • Explicit: by notifying the user or adapting itself
c. Application → Service Provider
   • Active measurements of service performance
d. Application → Network Provider
   • Monitoring of control PDUs
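As an illustration of item a., a minimal, hypothetical sketch of throughput-driven rate adaptation; the quality levels and thresholds are invented for illustration and do not describe Skype’s actual algorithm:

```python
QUALITY_LEVELS_KBPS = [64, 128, 256, 512]  # hypothetical codec rates

def adapt_level(measured_kbps: float, level: int) -> int:
    """Step the quality level up or down based on measured throughput."""
    target = QUALITY_LEVELS_KBPS[level]
    if measured_kbps < 0.8 * target and level > 0:
        return level - 1   # implicit feedback: the network cannot keep up
    if measured_kbps > 1.5 * target and level < len(QUALITY_LEVELS_KBPS) - 1:
        return level + 1   # headroom available: improve quality
    return level

level = 2                          # start at 256 kbit/s
level = adapt_level(150.0, level)  # measured 150 kbit/s -> drop to 128 kbit/s
```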


3. Feedback from the User

Implicit: give up / go away = churn
Explicit:
a. User → Network Operator
   • Blame the closest ISP
   • Not uncommon ISP attitudes:
     – “The problem is somewhere else”
     – “The user is an idiot”
b. User → Service Provider
   • Online quality surveys
c. User → Application
   • Change settings


4. Feedback from the Service Provider

• Towards the network operator in case of trouble
• Part of the one-stop service concept [4]:
  – Service provider = primary point of contact for the user of a service
  – User relieved from having to search for the problem (which is the service provider’s business)


The Auction Approach

Cf. [5], Chapter 5


Feedback Provided by Bandwidth Auctions

a. Bidding for resources on behalf of the user
b. Signalling of success or failure
c. Results communicated towards the user
   • Successful transfer at reasonable QoS
   • Unsuccessful transfer at low cost
d. Results communicated to network (and perhaps even service) provider
   • Dimensioning
   • SLA

A toy illustration of this flow follows below.
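A toy sketch of steps a–c, assuming a simple highest-bid-wins allocation; the actual auction mechanism is specified in [5], and all names here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Bid:
    user: str
    kbps: int     # requested bandwidth
    price: float  # willingness to pay

def run_auction(bids: list[Bid], capacity_kbps: int) -> dict[str, bool]:
    """Allocate capacity to the highest bids first; report success/failure."""
    outcome = {}
    for bid in sorted(bids, key=lambda b: b.price, reverse=True):
        if bid.kbps <= capacity_kbps:
            capacity_kbps -= bid.kbps
            outcome[bid.user] = True    # feedback: transfer at reasonable QoS
        else:
            outcome[bid.user] = False   # feedback: unsuccessful, at low cost
    return outcome

print(run_auction([Bid("A", 500, 2.0), Bid("B", 800, 1.5)], 1000))
# {'A': True, 'B': False}
```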


The AutoMon Approach

Cf. [5], Chapter 6


AutoMon Feedback

• DNA (Distributed Network Agent) = main element in a self-organising monitoring overlay
a. Local tests using locally available tools
b. Remote tests and inter-DNA communication
   • Comparison of measurement results
c. Alarms towards {network|service} provider(s) in case of perceived problems
   • E.g. using SNMP traps
d. Lookup facilities for providers
   • E.g. saving critical observations in a local MIB
e. Notification facilities towards users
   • Not mandatory, but maybe helpful


The AutoMon Project

• Design and Evaluation of Distributed, Self-Organized QoS Monitoring for Autonomous Network Operation (http://www.informatik.uni-wuerzburg.de/staff/automon)
• Sponsored by the Network of Excellence EuroNGI (http://www.eurongi.org)
• Partners (and prime investigators)
  – Infosim GmbH & Co. KG, Würzburg (S. Köhler, M. Schmid)
  – University of Würzburg, Dept. of Distributed Systems (K. Tutschku, A. Binzenhöfer)
  – Blekinge Institute of Technology, Dept. of Telecomm. Systems (M. Fiedler, S. Chevul)


The AutoMon Concept

1. DNA = Distributed Network Agent
   – Self-organising
   – Prototype available
     • Network operations
     • Simulations
2. NUF = Network Utility Function
   – Quality evaluation: user impairment = f(network problems)
   – Focus on throughput (TUF)
3. QJudge = demonstrator for
   – Quality evaluation (traffic-lights approach)
   – Feedback generation (traps)
   – MIB


The Way To Autonomous Networks

[Figure: autonomous managers wrap an IT system (e.g. LAN/MAN) in a control loop – they observe its input and output, analyze the observations, and act on the system.]


Disadvantages of a Central Monitor Station

[Figure: a central NMS monitors clients A–D, a mail server, a web server and a backup server; link status legend: up / down / unknown. Seen from the central station, the status of most links towards the clients and servers remains unknown.]


Advantages of Distributed Monitoring

[Figure: the same network with a DNA co-located at each client, server and the NMS. The DNAs provide an extended view of the mail server, act as a temporary DNS proxy when the DNS server is unreachable, and reroute around a failed link; link status legend: up / down / unknown.]


DNA Phase 1: Local Tests

[Figure: a DNA checks its own host – ping, IP configuration, cable status.]

Available local tests:
• NIC-Status
• NetConnectionStatus
• PingLocalHost
• IPConfiguration
• DNSConfiguration
• DHCPLease
• EventViewer
• HostsAndLmHosts
• RoutingTable
• PingOwnIP
• PingWellKnownHost

A minimal sketch of a few such checks follows below.
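As an illustration of the local tests, a minimal Python sketch that shells out to the system ping and resolves a well-known host name; the host names and the choice of tools are illustrative, not the AutoMon implementation:

```python
import platform
import socket
import subprocess

def ping(host: str, count: int = 1) -> bool:
    """PingLocalHost / PingWellKnownHost: True if the host answers."""
    flag = "-n" if platform.system() == "Windows" else "-c"
    result = subprocess.run(["ping", flag, str(count), host],
                            capture_output=True)
    return result.returncode == 0

def dns_ok(name: str) -> bool:
    """DNSConfiguration: check that a well-known name resolves."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    print("PingLocalHost:     ", ping("127.0.0.1"))
    print("DNSConfiguration:  ", dns_ok("www.example.com"))
    print("PingWellKnownHost: ", ping("www.example.com"))
```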


DNA Phase 2: Distributed Tests

[Figure: a DNA that cannot reach a server asks a peer DNA (“Test please”) to ping the same server and return the result, so that the two measurements can be compared.]

Available distributed tests:
• PingSpecificHost
• PingWellKnownHosts
• DNSProxy
• RerouteProxy
• PortScan
• Throughput
• Pinpoint Module

A sketch of such a comparison follows below.
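A minimal, hypothetical sketch of the comparison logic: if the local ping fails but a remote DNA reaches the same server, the problem is probably on the local access path. `ping` is the function from the Phase 1 sketch; `ask_peer_to_ping` is a stub standing in for the inter-DNA protocol, which is not shown here:

```python
def ask_peer_to_ping(peer: str, target: str) -> bool:
    """Stub: in AutoMon this request travels over the DNA overlay."""
    raise NotImplementedError("inter-DNA communication not shown")

def localize_problem(target: str, peer: str) -> str:
    """Compare local and remote test results to narrow down a fault."""
    local_ok = ping(target)                     # from the Phase 1 sketch
    remote_ok = ask_peer_to_ping(peer, target)  # peer runs the same test
    if local_ok:
        return "no problem observed"
    if remote_ok:
        return "problem likely on the local access path"
    return "problem likely near the server or in the core"
```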


The DNA Overlay Network

Use of a P2P-based overlay network
• DHT = Kademlia
• Peer = DNA

[Figure: DNAs spread across the Internet form the overlay.]

Challenges:
• keep the overlay connected
• locate a specific DNA
• locate a random DNA

The Kademlia lookup metric is sketched below.
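Kademlia locates peers by the XOR distance between node IDs. A minimal sketch of that metric, assuming small integer IDs for readability (real Kademlia uses 160-bit IDs):

```python
def xor_distance(a: int, b: int) -> int:
    """Kademlia distance between two node IDs."""
    return a ^ b

def closest_peers(target: int, peers: list[int], k: int = 3) -> list[int]:
    """Return the k peers closest to the target ID - the basis of lookups."""
    return sorted(peers, key=lambda p: xor_distance(p, target))[:k]

print(closest_peers(0b1010, [0b0001, 0b1000, 0b1111, 0b1011]))
# [11, 8, 15]: 0b1011 differs from 0b1010 only in the lowest bit
```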


Scalability Results Using the DNA Prototype

[Figure: average search duration [ms] (y-axis, approx. 100–500 ms) versus overlay size (x-axis, 0–1200 peers); average online time = 60 min, no churn.]


Network Utility Function

UOut = UNetw · UIn

[Figure: a DNA at the sender evaluates the original quality (UIn), a DNA at the receiver evaluates the received quality (UOut), and the comparison yields the quality of the network in between (UNetw).]


Network Utility Function

UOut = UNetw · UIn

• Range of U: 0 (worst) ... 100 % (best) – intuitive for
  – Users
  – Providers
  – Operators
• Captures the performance-damping effect of the network
  – UNetw = 1 → network “transparent”
• Bad service perception (UOut → 0) can have its roots in
  – Badly performing network (UNetw → 0)
  – Badly performing application (UIn → 0)

A minimal numeric sketch follows below.
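A numeric illustration of the multiplicative damping (not AutoMon code; the numbers are invented):

```python
def network_utility(u_in: float, u_out: float) -> float:
    """Solve UOut = UNetw * UIn for UNetw from the two end-point utilities."""
    return u_out / u_in if u_in > 0 else 0.0

# A transparent network leaves the input utility untouched ...
print(network_utility(u_in=0.95, u_out=0.95))  # 1.0
# ... while a damping network is exposed by the ratio:
print(network_utility(u_in=0.95, u_out=0.57))  # 0.6
```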


Throughput Utility Function

UNetw = Um · Us · Un

• Basis: throughput
  – on small time scales ΔT
  – during an observation interval ΔW
• m-utility function Um: captures the impact of changes in traffic volume
  – Overdue traffic (→ late or lost)
• s-utility function Us: captures the impact of changes in traffic burstiness
  – Shaping = reduction (→ throttle)
  – Sharing = increase (→ interfering traffic)
• n-utility function Un:
  – Bias by network (e.g. UMTS vs. LAN)

A toy instantiation is sketched below.
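The slides do not give the formulas for Um and Us, so the following is only a toy instantiation under stated assumptions: Um is taken as the ratio of received to sent average throughput, Us penalises a change in throughput standard deviation between sender and receiver, and Un is left at 1:

```python
from statistics import mean, stdev

def throughput_utility(sent_kbps: list[float],
                       recv_kbps: list[float]) -> float:
    """Toy TUF: UNetw = Um * Us * Un over per-interval throughput samples."""
    u_m = min(mean(recv_kbps) / mean(sent_kbps), 1.0)  # overdue/lost volume
    s_in, s_out = stdev(sent_kbps), stdev(recv_kbps)
    u_s = 1.0 if max(s_in, s_out) == 0 else min(s_in, s_out) / max(s_in, s_out)
    u_n = 1.0                                          # network bias ignored
    return u_m * u_s * u_n

# Some traffic arrives late or lost, and burstiness changes slightly:
print(throughput_utility([100, 120, 80, 100], [95, 110, 70, 85]))  # ~0.87
```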


Recent Skype-via-UMTS Results: PESQ and NUF/TUF [6]

[Figure: measurement results comparing PESQ scores with NUF/TUF values.]

PESQ = Perceptual Evaluation of Speech Quality
NUF = Network Utility Function
TUF = Throughput Utility Function


SNMP Interface

• Trap generation
  – Upon threshold crossing, e.g.
    • Green → Yellow
    • Yellow → Red
• (Enterprise) MIB
  – Not yet designed
  – Cf. RMON history group
    • Statistics (m, s)?
    • Array with values?
    • Histograms?

Simple parameters for monitoring of Skype:
• Green:  UNetw = Um ≥ 80 %
• Yellow: UNetw = Um ≥ 50 %
• Red:    UNetw = Um < 50 %

• Why not just participate in the overlay? ;)
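A minimal sketch of the traffic-light classification and trap generation on a threshold crossing, using the Skype thresholds above; `send_trap` is a hypothetical stub standing in for a real SNMP trap sender:

```python
def traffic_light(u_m: float) -> str:
    """Classify UNetw = Um using the Skype monitoring thresholds above."""
    if u_m >= 0.8:
        return "green"
    if u_m >= 0.5:
        return "yellow"
    return "red"

def send_trap(transition: str) -> None:
    """Hypothetical stub; a real DNA would emit an SNMP trap here."""
    print(f"TRAP: {transition}")

def monitor(samples: list[float]) -> None:
    """Fire a trap whenever the traffic-light state degrades."""
    order = {"green": 0, "yellow": 1, "red": 2}
    state = "green"
    for u_m in samples:
        new = traffic_light(u_m)
        if order[new] > order[state]:  # threshold crossed downwards
            send_trap(f"{state} -> {new}")
        state = new

monitor([0.9, 0.85, 0.7, 0.45, 0.6])  # traps: green -> yellow, yellow -> red
```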


Thesis 3: The user needs to be relieved from decisions based on incomplete feedback


Status

Internet usage still implies a high degree of self-service:
• Some kind of Internet paradigm (just provide connectivity, the rest is left to the user)
• The “anything-over-IP-over-anything” principle provides both opportunities and nightmares
• Mastered differently by different applications (better by some, worse by others)
• A lot of “decision making” is left to the user – does (s)he really know about the implications?
• Recent trend towards IMS (IP Multimedia Subsystem): might help, but will the Internet community accept it?


Status

Issues:
• How do subjective QoE and objective QoS parameters match each other?
  – More or less solved for some applications
• How can I be sure that
  – “my” task is performed and completed
  – “my” problems are detected and worked on in time?
• Which network can be used for a particular task?
  – Rough indications available
• “Money back” policies?
  – Cf. airlines and (some) train companies

Solving these issues increases dependability perception and thus trust.


Wish-list

• No additional complexity for the user!
  – Application of self-organisation principles
• Preventive feedback:
  – Clear guidelines and indications regarding (im-)possibilities
  – Optional cross-layer interfaces required
• Reactive feedback:
  – Signalling of success or failure
    • Again a matter of cross-layer interfaces
  – Action on behalf of the user
    • Notifications
    • Selections (e.g. of a particular network)


Wish-list (continued)

• The Internet community should care about end-user perception – tendencies visible:
  – Next Generation Internet
  – Internet2
  – GENI initiative
• Performance researchers should care about the end user
  – What is the use of your studies?
  – How can you relate your results to user perception?


References

1. A. Bouch, A. Kuchinsky, and N. Bhatti. Quality is in the eye of the beholder: Meeting users’ requirements for Internet quality of service. Technical Report HPL-2000-4, HP Laboratories Palo Alto, January 2000.
2. Nokia White Paper: Quality of Experience (QoE) of mobile services: Can it be measured and improved? http://www.nokia.com/NOKIA_COM_1/Operators/Downloads/Nokia_Services/whitepaper_qoe_net.pdf
3. D. Soldani, M. Li, and R. Cuny, eds. QoS and QoE Management in UMTS Cellular Systems. Wiley, 2006.


References (continued)

4. M. Fiedler, ed. EuroNGI Deliverable D.WP.JRA.6.1.1: State-of-the-art with regards to user-perceived Quality of Service and quality feedback. May 2004. http://eurongi.enst.fr/archive/127/JRA611.pdf
5. M. Fiedler, ed. EuroNGI Deliverable D.WP.JRA.6.1.3: Studies of quality feedback mechanisms within EuroNGI. May 2005. http://eurongi.enst.fr/archive/127/JRA613.pdf
6. T. Hoßfeld, A. Binzenhöfer, M. Fiedler, and K. Tutschku. Measurement and Analysis of Skype VoIP Traffic in 3G UMTS Systems. Proc. of IPS-MoMe 2006, Salzburg, Austria, Feb. 2006, pp. 52–61.


CfP

• WP.IA.8.6: First EuroNGI Workshop on Socio-Economic Impacts of NGI
• DTU, Lyngby (Copenhagen), Denmark, Oct. 9–10, 2006
• http://eurongi06.com.dtu.dk/
• Still accepting contributions (extended abstracts)


Thank you for your interest ☺ Q&A

[email protected] Skype: mfibth
