Multimedia Quality of Experience (QoE)
by Luigi Atzori, Tasos Dagiuklas, Periklis Chatzimisios, Chang Wen Chen

Table of Contents
About the Editors
List of Contributors
Preface
1 Introduction
2 QoE – Defining a User-Centric Concept for Service Quality
2.1 Introduction
2.2 Definitions of QoE
2.3 Differences Between QoE and QoS
2.4 Factors Influencing QoE
2.5 Service QoE
2.6 Human Factors and QoE
2.7 The Role of QoE in Communication Ecosystems
2.8 Conclusions
Acknowledgments
Notes
References
Acronyms
3 Review of Existing Objective QoE Methodologies
3.1 Overview
3.2 Quality Metric Taxonomy
3.3 Basic Computational Modules for Perceptual Quality Metrics
3.4 Quality Metrics for Images
3.5 Quality Metrics for Video
3.6 Quality Metrics for Audio/Speech
3.7 Joint Audiovisual Quality Metrics
3.8 Concluding Remarks
References
Acronyms
4 Quality of Experience for HTTP Adaptive Streaming Services
4.1 Introduction
4.2 HAS Concepts and Standardization Overview
4.3 QoE in 3GPP DASH
4.4 Link-Aware Adaptive Streaming
4.5 Video-Aware Radio Resource Allocation
4.6 DASH over e-MBMS
4.7 Server–Client Signaling Interface Enhancements for DASH
4.8 Conclusion
Notes
References
Acronyms
5 No-Reference Approaches to Image and Video Quality Assessment
5.1 Introduction
5.2 No-Reference Quality Assessment
5.3 Image and Video Quality Databases
5.4 Performance Evaluation
5.5 Applications
5.6 Challenges and Future Directions
5.7 Conclusion
References
Acronyms
6 QoE Subjective and Objective Evaluation Methodologies
6.1 Human Visual Perception and QoE Assessment
6.2 Models and Approaches to QoE Assessment
6.3 Offline and Online Evaluation
6.4 Remarks
Acknowledgments
Notes
References
Acronyms
7 QoE Control, Monitoring, and Management Strategies
7.1 Introduction
7.2 QoE Monitoring
7.3 QoE Management and Control
7.4 Conclusion
Acknowledgment
References
Further Reading
Acronyms
8 Conclusions
Index
EULA
List of Tables
Chapter 4
Table 4.1
Table 4.2
Table 4.3
Chapter 5
Table 5.1
Table 5.2
Chapter 6
Table 6.1
List of Illustrations
Chapter 2
Figure 2.1
The layered QoE model proposed in [11]. QoE is about the user, and hence cannot be considered absent the physiological, psychological, social (“Human”), and role-related (“User”) aspects of the user as a person, whereas QoS concerns the system, and hence it is considered at the Resource and Application layers. Incidentally, these layers can be mapped to the OSI network model
Figure 2.2
Multi-dimensional view of QoE influence factors
Figure 2.3
The triangle model for ecosystem quality
Figure 2.4
Model for QoE-based charging.
Chapter 3
Figure 3.1
The quality metric taxonomy
Chapter 4
Figure 4.1
HAS framework between the client and web/media server
Figure 4.2
DASH MPD hierarchical data model
Figure 4.3
MPEG DASH profiles
Figure 4.4
QoE metrics and reporting framework for 3GPP DASH and progressive download
Figure 4.5
Adaptive streaming client player states
Figure 4.6
Startup delay and startup quality comparison for PLA and PLU approaches
Figure 4.7
Rebuffering and average quality comparison for PLA and PLU approaches
Figure 4.8
Rebuffering and average quality comparison for RAGA with different scheduling approaches
Figure 4.9
Transport-layer processing overview
Figure 4.10
Markov model for simulating LTE RLC-PDU losses
Figure 4.11
Startup delay as a function of K_min
Figure 4.12
Performance comparisons for K/N = 0.8, 0.9: average PSNR
Figure 4.13
Performance comparisons for K/N = 0.8, 0.9: rebuffering percentage
Figure 4.14
Download rate (throughput) for four clients streaming content from the same server and network
Chapter 5
Figure 5.1
Blind quality assessment models requiring different amounts of prior information
Chapter 6
Figure 6.1
A color image sequence example: (a) image sequence; (b) three-level DWT decomposition sub-band designations; (c) three-color channel signals in RGB space; (d) three-color channel signals in YCbCr space
Figure 6.2
Examples of chroma distortions in the Barbara test image due to chroma sub-sampling: (a) the uncompressed image in component YCbCr 4:4:4 format; (b) the image in component 4:2:2 format; (c) the image in component 4:2:0 format; (d) the image in component 4:1:1 format; (e) contrast-enhanced difference image between (a) and (b); (f) contrast-enhanced difference image between (a) and (c); (g) contrast-enhanced difference image between (a) and (d). Contrast enhancement was applied to difference images with a bias of 128 shown in (e)–(g) for better visualization in PDF format or using photo-quality printing
Figure 6.3
R-D optimization considering a perceptual distortion measure [3] or a utility score [4] for QoE-regulated services compared with the MSE.
Figure 6.4
An information-theoretic framework used by VIF measurement (after [58]), where C = S ⊙ U is the GSM, an RF, as the NSS model in the wavelet domain, approximating the reference image; C_k and U_k are M-dimensional vectors consisting of non-overlapping blocks of M coefficients in a given sub-band; U is a Gaussian vector RF with zero mean and covariance C_U; S is an RF of positive scalars; the symbol "⊙" denotes the element-by-element product of two RFs [58]; and ℐ is the set of location indices in the wavelet decomposition domain. D = GC + V is the RF representing the distorted image in the same sub-band, with G a deterministic scalar field and V a stationary additive zero-mean Gaussian noise RF with variance σ_v², which is white and independent of S and U, with identity matrix I and C_V = σ_v² I. E = C + N and F = D + N′ model HVS visual distortions to the reference C and channel/coding distortion D, respectively, with RFs N and N′ being zero-mean uncorrelated multivariate Gaussian of M dimensions with covariance C_N = σ_n² I and σ_n² the variance of the visual noise; b is the sub-band index and b* the selected sub-band critical for VIF computation
Figure 6.5
Multichannel contrast gain control model. Source: Tan et al., 2014 [82]. Reproduced with permission of Dr. Tan
Chapter 7
Figure 7.1
Reduced-reference quality metric example. Source: Martini et al., 2012 [12] and Martini et al., 2012 [13]. Reproduced with permission of Springer and Elsevier
Figure 7.2
Application controller (left) and base station controller (right) proposed in the OPTIMIX European project
Figure 7.3
Wireless delivery to multiple users – example scenario