The following three research papers show that the key bottleneck for an untethered VR experience on mobile devices is rendering a frame, and each tackles this problem in a different way. Together they form a nice series of work on practical VR on mobile devices.
FlashBack is one of the first papers to study the impact of running VR applications on mobile devices. The authors show that frame rendering is very compute intensive and that cheap mobile hardware does not meet this compute requirement. Instead, they observe that mobile devices often have a lot of underutilized storage, which can be used to memoize pre-rendered frames. They demonstrate the effectiveness of their method by pre-rendering frames for all possible head orientations of the Viking Village VR application offline and storing them in a three-tier hierarchical cache (GPU memory, RAM, SSD).
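The memoization idea can be sketched as a pose-indexed, tiered cache lookup. This is a minimal illustration, not FlashBack's actual implementation: the class, quantization step, and promotion policy are all assumed for the example.

```python
# Sketch of a FlashBack-style memoized frame lookup (names, tier layout,
# and the 5-degree quantization step are illustrative assumptions, not
# taken from the paper). A head pose is quantized to a grid point, which
# keys a three-tier cache: GPU memory -> RAM -> SSD.

class TieredFrameCache:
    def __init__(self):
        # Fastest to slowest tier; in a real system these would be GPU
        # texture memory, host RAM, and files on the SSD.
        self.gpu = {}   # tier 0: small, hottest entries
        self.ram = {}   # tier 1
        self.ssd = {}   # tier 2: holds every pre-rendered frame

    def quantize(self, yaw, pitch, step=5.0):
        """Snap a head orientation to the nearest pre-rendered grid point."""
        return (round(yaw / step) * step, round(pitch / step) * step)

    def lookup(self, yaw, pitch):
        key = self.quantize(yaw, pitch)
        for tier in (self.gpu, self.ram, self.ssd):
            if key in tier:
                frame = tier[key]
                self.gpu[key] = frame  # promote hot entry to fastest tier
                return frame
        return None  # miss: would fall back to (slow) on-device rendering

cache = TieredFrameCache()
cache.ssd[(90.0, 0.0)] = "frame_bytes_for_90_0"
print(cache.lookup(91.8, 1.2))  # quantizes to (90.0, 0.0), hits SSD tier
```

The quantization is what makes exhaustive pre-rendering feasible at all: the cache only needs one frame per grid point, at the cost of a small orientation error that the real system corrects at display time.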
Comments: Very specific to one VR application; the pre-rendered cache must be rebuilt per application, so the approach does not scale well.
Furion is a fantastic paper that investigates how current mobile devices can satisfy the rendering requirements of VR applications in cooperation with a remote rendering server. The authors argue that future mobile hardware alone will not satisfy the rendering compute requirements, because mobile hardware improvements are nearly saturated. They further show that wireless networks cannot support placing all rendering on a remote server and streaming raw frame data over the wireless link: a raw frame stream requires a data rate of tens of Gbps, far beyond existing and upcoming wireless technologies such as 802.11ad, ay, or ax.
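The "tens of Gbps" figure is easy to sanity-check with back-of-the-envelope arithmetic. The resolution, color depth, and frame rate below are assumed round numbers for illustration, not figures quoted from the paper.

```python
# Back-of-the-envelope check of the raw-stream bandwidth claim. The
# parameters are assumed typical VR values, not taken from the paper.
width, height = 3840, 2160   # one uncompressed 4K frame
bits_per_pixel = 24          # raw RGB, no compression
fps = 90                     # a common VR refresh rate

raw_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"{raw_gbps:.1f} Gbps")  # ~17.9 Gbps for a single raw stream
```

Even a single uncompressed stream lands in the tens-of-Gbps range, and stereo or higher-resolution panoramic frames only push it further, which is why some form of compression or workload split is unavoidable.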
Instead, Furion characterizes different VR applications and figures out a way to split the rendering into foreground and background activities. They show that foreground activities are lightweight in rendering load but not predictable, and hence can be rendered on the mobile device, whereas background activities are heavyweight in rendering load but predictable, and hence can be rendered on a remote server. The remote server also compresses the frames to reduce the data sent over the wireless link, and the mobile device decodes and displays them.
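The cooperative split can be sketched as follows. The class and function names are illustrative stand-ins, not Furion's actual API; the point is only the division of labor per frame.

```python
# Sketch of Furion-style cooperative rendering (names are illustrative
# assumptions, not from the paper). Background frames are heavy but
# predictable from head pose, so a remote server pre-renders, encodes,
# and streams them; foreground objects are lightweight and interaction-
# driven, so they render on-device and never wait on the network.

class RemoteServer:
    def fetch_background(self, pose):
        # Stand-in for: server renders the panoramic background for this
        # pose, encodes it, and streams it over the wireless link.
        return f"encoded_bg@{pose}"

def decode(encoded):
    # Stand-in for the phone's hardware video decoder.
    return encoded.replace("encoded_", "")

def render_local(obj, pose):
    # Stand-in for lightweight on-device rendering of one object.
    return f"{obj}@{pose}"

def render_frame(pose, foreground_objs, server):
    background = decode(server.fetch_background(pose))  # remote, predictable
    foreground = [render_local(o, pose) for o in foreground_objs]  # local
    return (background, foreground)  # composited for display

frame = render_frame("yaw=30", ["controller", "hud"], RemoteServer())
print(frame)
```

The design choice worth noting is that latency-sensitive content never crosses the network: a delayed background frame degrades quality gracefully, while a delayed foreground object would break interactivity.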
Cutting the Cord: MobiSys 2018
This paper follows Furion by offloading the complete rendering computation to a remote server. The authors argue that the mobile device's display rate should be in sync with the remote server's rendering so as to avoid missed frames and unnecessary frame display delays. They propose a parallel rendering and streaming pipeline in which the remote server first renders the left-eye image of the VR application and starts streaming it (encode, transfer, and decode) while simultaneously rendering the right-eye image. They further propose a remote-VSync-driven rendering approach that synchronizes the remote server's frame rendering with the mobile device's display rate.
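The benefit of overlapping streaming with rendering can be illustrated with a toy per-frame timeline. The stage durations below are made-up numbers, not measurements from the paper; the sketch only shows where the savings come from.

```python
# Toy timeline comparing sequential vs. pipelined stereo frame delivery,
# in the spirit of the paper's parallel rendering-and-streaming pipeline.
# Stage durations (in ms) are illustrative assumptions, not measured data.
render = 5.0    # per-eye render time on the server
stream = 6.0    # per-eye encode + transfer + decode time

# Sequential: render both eyes, then stream both eyes.
sequential = 2 * render + 2 * stream

# Pipelined: stream the left eye while the right eye renders, so the
# right-eye render time hides behind the left-eye streaming.
pipelined = render + max(render, stream) + stream

print(sequential, pipelined)  # 22.0 vs 17.0 ms per stereo frame
```

The saving is bounded by whichever of render and stream is shorter, which is exactly why the server also needs the remote-VSync mechanism: pipelining alone shortens the frame time but does not align frame completion with the display's refresh ticks.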