Human-Centric Composite-Quality Modeling and Assessment for Virtual Desktop Clouds

Release Date: 2013-03-27  Authors: Yingxiao Xu, Prasad Calyam, David Welling, Saravanan Mohan, Alex Berryman, and Rajiv Ramnath

[Abstract] Motivations such as mobility, cost, and security are driving traditional desktop users to transition to thin-client-based virtual desktop clouds (VDCs). This trend has raised the importance of human-centric performance modeling and assessment within user communities that increasingly rely on desktop virtualization. In this paper, we present a novel reference architecture, and an easily deployable implementation of it, for modeling and assessing objective user quality of experience (QoE) in VDCs. The architecture eliminates the need for expensive, time-consuming subjective testing: it incorporates finite-state-machine representations for user workload generation, together with slow-motion benchmarking and deep-packet inspection of application task performance under QoS variations. From these measurements, a “composite-quality” metric model of user QoE can be derived. We show how this metric can be customized to a particular user group profile with different application sets, and how it can be used to a) identify dominant performance indicators and troubleshoot bottlenecks, and b) obtain both absolute and relative objective user QoE measurements needed for the pertinent selection of thin-client encoding configurations in VDCs. We validate our composite-quality modeling and assessment methodology using subjective and objective user QoE measurements in VDPilot, a real-world VDC that uses the RDP and PCoIP thin-client protocols; in our case study, actual users participate in virtual classrooms within a regional federated university system.

[Keywords] virtual desktops; quality modeling and assessment; performance benchmarking; thin-client protocol adaptation; objective QoE metrics