DOI: 10.1145/3386290.3396938

What you see is what you get: measure ABR video streaming QoE via on-device screen recording

Published: 08 June 2020

Abstract

Analyzing delivered QoE for Adaptive Bitrate (ABR) streaming over cellular networks is critical for a host of entities, including content providers and mobile network providers. However, existing approaches mostly rely on network traffic analysis. In addition to potential accuracy issues, they are challenged by the increasing use of end-to-end network traffic encryption. In this paper, we explore a very different approach to QoE measurement: utilizing the screen recording capability widely available on commodity devices to record the video displayed on the mobile device screen, and analyzing the recorded video to measure the delivered QoE. We design a novel system, VideoEye, to conduct such screen-recording-based QoE analysis. We identify the various technical challenges involved, including distortions introduced by the screen recording process that can make such analysis difficult, and we develop techniques to accurately measure video QoE from screen recordings even in the presence of these distortions. Our evaluations demonstrate that VideoEye accurately detects important QoE indicators, including the track played at each point in time and stall statistics. The maximum error in detected stall duration is 0.5 s, and the accuracy of detecting the displayed tracks exceeds 97%.
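The abstract's core idea of recovering stalls from recorded frames can be illustrated with simple frame differencing: a stall shows up as a run of (near-)identical frames in the screen recording. The sketch below is a simplified illustration under that assumption, not VideoEye's actual pipeline; the function name `detect_stalls` and its thresholds are hypothetical.

```python
import numpy as np

def detect_stalls(frames, fps=30.0, diff_threshold=1.0, min_stall_frames=3):
    """Report (start_time_s, duration_s) for runs of (near-)identical frames.

    frames: list of 2-D numpy arrays (grayscale frames decoded from the
    screen recording). Consecutive frames whose mean absolute pixel
    difference falls below diff_threshold are treated as a frozen picture;
    a run of at least min_stall_frames such frames counts as a stall.
    """
    stalls = []
    run_start = None
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float)).mean()
        frozen = diff < diff_threshold
        if frozen and run_start is None:
            run_start = i - 1              # the stall includes the previous frame
        elif not frozen and run_start is not None:
            if i - run_start >= min_stall_frames:
                stalls.append((run_start / fps, (i - run_start) / fps))
            run_start = None
    # Close out a stall that runs to the end of the recording.
    if run_start is not None and len(frames) - run_start >= min_stall_frames:
        stalls.append((run_start / fps, (len(frames) - run_start) / fps))
    return stalls
```

In practice the paper must also cope with distortions introduced by the screen recording process (variable capture rate, encoder artifacts), so a fixed pixel-difference threshold like this is only a starting point.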




Published In

NOSSDAV '20: Proceedings of the 30th ACM Workshop on Network and Operating Systems Support for Digital Audio and Video
June 2020
73 pages
ISBN:9781450379458
DOI:10.1145/3386290
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. QoE measurement
  2. adaptive bitrate
  3. screen recording
  4. video streaming

Qualifiers

  • Research-article

Funding Sources

  • NSF

Conference

MMSys '20: 11th ACM Multimedia Systems Conference
June 10 - 11, 2020
Istanbul, Turkey

Acceptance Rates

NOSSDAV '20 Paper Acceptance Rate 10 of 22 submissions, 45%;
Overall Acceptance Rate 118 of 363 submissions, 33%

