Researchers seek to enhance real-time mobile AR games with novel edge-native approach
A new research paper from Carnegie Mellon’s Living Edge Lab explores the future of synchronous, multiplayer mobile Augmented Reality (mAR) gaming through their game, dubbed V-Light. The use of edge computing to enhance gameplay is not new; what the paper proposes, however, is a novel framework and development tool for edge-native games that also incorporates advanced body motion sensing to enhance player interaction.
Augmented reality (AR), a metaverse staple, is designed to enhance and add to reality. In other words, AR enables users to view the real world with additional interactive digital elements. An AR app overlays interactive information on real-world environments using visual, auditory, and other sensory cues to enhance the user experience.
The paper, co-authored by Jim Blakley, the associate director of Living Edge Lab; Jessica Hammer of the CMU Entertainment Technology Center; and Bob Gazda of InterDigital, says mobile AR platforms present game designers with various challenges due to technical limitations, including memory constraints, processing power, battery life and latency for cloud-based solutions.
As a result, real-time interaction is not always possible, and companies have had to limit their mAR experiences to off-screen, asynchronous multiplayer and low-fidelity visuals. Despite these challenges, the authors note that mAR still holds great potential for game designers, as evidenced by the success of games like Ingress and Pokémon Go.
In the paper, the team explores how edge computing can be used to improve mobile AR games like V-Light, which require real-time responses from players.
As opposed to cloud-based processing, edge computing brings computational resources closer to the user, decreasing latency while keeping bandwidth high. These qualities make it an ideal technology for improving mobile AR games, the paper's authors say. To that end, the V-Light project provides a detailed toolkit to help developers design games optimized for edge computing.
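The latency argument above can be made concrete with a back-of-the-envelope frame-budget check. The sketch below is not from the paper; the function name and all timing numbers are illustrative assumptions, chosen only to show why a nearby edge server can fit inside a real-time frame budget while a distant cloud round trip cannot.

```python
# Minimal sketch (illustrative, not from the paper): does an offloaded
# processing step fit inside one frame of a synchronous mAR game?

FRAME_BUDGET_MS = 33.0  # ~30 frames per second real-time target

def fits_budget(capture_ms, network_rtt_ms, offload_compute_ms, render_ms):
    """Return True if capture + network round trip + remote compute +
    render all complete within a single frame budget."""
    total = capture_ms + network_rtt_ms + offload_compute_ms + render_ms
    return total <= FRAME_BUDGET_MS

# Assumed round-trip times: a nearby edge server vs. a distant cloud region.
edge_ok = fits_budget(capture_ms=5, network_rtt_ms=10,
                      offload_compute_ms=8, render_ms=5)   # 28 ms total
cloud_ok = fits_budget(capture_ms=5, network_rtt_ms=80,
                       offload_compute_ms=8, render_ms=5)  # 98 ms total
print(edge_ok, cloud_ok)  # edge fits the frame budget; cloud does not
```

Under these assumed numbers, only the edge path leaves headroom for per-frame offload, which is the core design constraint the paper says edge-native games must respect.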
In addition, the team proposes a shared world model for mobile AR games, which can be used to prototype and test the designs. The paper also explores current mobile AR limitations and provides a framework for addressing them with edge computing.
The paper also offers insight into the design of mAR games and other mAR applications, providing guidance on how to work with body motion, multi-party relationships and portability. Additionally, it offers game designers a prototyping tool that can help them create games that current systems could not otherwise support. In other words, the research could have significant implications for the design and development of future edge-native mobile AR games.