Controller-Free Point Cloud VR
Qilmeg Doudatcz, A. Alfred Taubman College of Architecture and Urban Planning
Collaborators: Mardy Hillengas, Taubman; Junde Song, LSA (Computer Science); Gardi Badruun, founder of Ger Hub, who is assisting with photogrammetry collection
Faculty Advisor: Dawn Gilpin, Taubman
Controller-Free Point Cloud VR grew out of our ongoing research within the Empathy in Point Clouds (EIPC) project. EIPC is an interdisciplinary undertaking that depends on close collaboration among faculty and students. Central to the project is the LiDAR-scanned point cloud model, which can preserve transient moments and culturally significant spaces and render them as enduring, engaging experiences.
Currently, EIPC’s primary outputs are video presentations along curated navigation paths and VR projects developed in Unreal Engine, both of which require a tethered headset for navigation. Our central objective is to make the project more accessible and interactive by turning it into a seamless, cordless VR experience.
We propose two methods for collaborative interaction with the point clouds. The first builds a multiplayer VR experience that runs directly on the headsets. The second uses depth cameras to drive interaction with the point clouds, with the output shown through projectors to accommodate larger groups.
Our research has three primary components. The first is hand tracking, which removes the need for physical controllers. To streamline collaboration and interaction on Meta Quest VR headsets, we plan to integrate the Oculus Interaction SDK, which includes components for tracking hand gestures and movements, significantly reducing friction during navigation and interaction.
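At its core, the gesture recognition this SDK provides comes down to comparing tracked joint positions each frame. The SDK implements this natively inside the game engine; the Python sketch below only illustrates the idea of pinch detection. The joint names, the example frame data, and the 2 cm threshold are assumptions for illustration, not SDK values.

# Minimal sketch of the pinch-detection logic that hand-tracking SDKs expose.
# Joint names, the data source, and the 2 cm threshold are assumptions.
import math

PINCH_THRESHOLD_M = 0.02  # assumed: fingertips closer than 2 cm count as a pinch

def distance(a, b):
    """Euclidean distance between two (x, y, z) joint positions in meters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(joints):
    """Return True when the thumb tip and index tip are close enough to pinch.

    `joints` maps joint names to (x, y, z) positions, e.g. polled each frame
    from whatever hand-tracking runtime is in use.
    """
    thumb = joints.get("thumb_tip")
    index = joints.get("index_tip")
    if thumb is None or index is None:  # hand not currently tracked
        return False
    return distance(thumb, index) < PINCH_THRESHOLD_M

# Example frame of hypothetical tracked joint data:
frame = {"thumb_tip": (0.10, 1.32, 0.40), "index_tip": (0.11, 1.33, 0.41)}
print(is_pinching(frame))  # True: the fingertips are about 1.7 cm apart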
We are also exploring Kinect sensors within Touch Designer as a way to add full-body movement tracking, which is particularly well suited to larger group sessions and projector-based point cloud navigation. During this stage, we will also gather photos of culturally significant objects from individuals in Inner Mongolia and Mongolia in collaboration with Ger Hub; these photos will serve as source material for the photogrammetry models we will place within the virtual space.
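As a sketch of how the Kinect data could drive navigation, the following Python is written as a Touch Designer CHOP Execute DAT callback. The operator names ('kinect1', 'cam1'), the channel names, and the scaling factors describe a hypothetical network, not a finished setup.

# Touch Designer sketch (assumed network): a CHOP Execute DAT watching a
# Kinect CHOP named 'kinect1', using the first skeleton's right-hand position
# to orbit a Camera COMP named 'cam1' around the point cloud. Channel names
# follow the Kinect CHOP's "p1/hand_r:tx" convention, but the exact names and
# scaling factors here are assumptions.

ORBIT_SCALE = 180.0   # assumed: map normalized hand x to +/-180 degrees of yaw
HEIGHT_SCALE = 2.0    # assumed: map normalized hand y to camera height in meters

def onValueChange(channel, sampleIndex, val, prev):
    kinect = op('kinect1')
    cam = op('cam1')

    # Guard against frames where the skeleton is not tracked.
    hand_x = kinect.chan('p1/hand_r:tx')
    hand_y = kinect.chan('p1/hand_r:ty')
    if hand_x is None or hand_y is None:
        return

    # Drive camera yaw and height from the tracked hand position.
    cam.par.ry = hand_x.eval() * ORBIT_SCALE
    cam.par.ty = 1.0 + hand_y.eval() * HEIGHT_SCALE
    return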
The second component is multiplayer VR collaboration running directly on the headsets. Both Unity and Unreal Engine offer dedicated multiplayer project templates for VR development. This approach lets participants meet inside the point clouds and opens the door to remote collaboration, fostering a more inclusive and interactive environment.
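To make concrete what those templates replicate for us, the sketch below shows the core of a shared session in plain Python: each participant broadcasts its head and hand transforms over UDP and keeps the latest poses received from peers. The port, addressing, and JSON message format are illustrative assumptions; the engine templates additionally handle joining, authority, and interpolation.

# Minimal sketch of the state shared in a multiplayer point cloud session.
# Addresses, the port, and the message format below are assumptions.
import json
import socket
import time

PORT = 49152  # assumed port
BROADCAST = ("255.255.255.255", PORT)

def make_socket():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    sock.setblocking(False)
    return sock

def broadcast_pose(sock, user_id, head, left_hand, right_hand):
    """Send this participant's tracked transforms to everyone on the LAN."""
    msg = {"id": user_id, "head": head, "left": left_hand,
           "right": right_hand, "t": time.time()}
    sock.sendto(json.dumps(msg).encode("utf-8"), BROADCAST)

def receive_poses(sock, peers):
    """Drain pending packets and keep the latest pose per participant."""
    while True:
        try:
            data, _ = sock.recvfrom(4096)
        except BlockingIOError:
            break
        msg = json.loads(data.decode("utf-8"))
        peers[msg["id"]] = msg
    return peers

if __name__ == "__main__":
    sock = make_socket()
    peers = {}
    # One update tick: send our (hypothetical) pose, then read any that arrived.
    broadcast_pose(sock, "headset-A", head=[0, 1.6, 0],
                   left_hand=[-0.2, 1.2, 0.3], right_hand=[0.2, 1.2, 0.3])
    print(receive_poses(sock, peers))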
Lastly, our research addresses projector mapping, a pivotal part of the solution. Once hand tracking is integrated and interaction with the point clouds is working, our focus shifts to the visual side of that interaction. We want to explore how the point data can be projected not only onto flat surfaces but also mapped within Touch Designer onto topographical site models, which adds site context and deepens the immersive experience.
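For the flat-surface case, the projection step reduces to a corner-pin homography between the rendered image and the projector's view of the surface; Touch Designer exposes this interactively, and the topographical-model case requires a full 3D projector calibration. The Python sketch below illustrates only the flat case with OpenCV; the corner coordinates are placeholders, not measured values.

# Sketch of the flat-surface case of projector mapping: a corner-pin
# homography that warps rendered point cloud imagery into the projector's
# view of the surface. The corner coordinates are placeholder values.
import cv2
import numpy as np

# Corners of the rendered image (pixels) ...
src = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
# ... and where those corners should land in the projector's output so the
# image sits squarely on the physical surface (assumed, measured on site).
dst = np.float32([[102, 64], [1830, 41], [1874, 1012], [75, 1043]])

# 3x3 homography mapping source pixels to projector pixels.
H = cv2.getPerspectiveTransform(src, dst)

# Warp a rendered frame (here a placeholder gradient) into projector space.
frame = np.tile(np.linspace(0, 255, 1920, dtype=np.uint8), (1080, 1))
warped = cv2.warpPerspective(frame, H, (1920, 1080))
print(H.round(3))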
In summary, Controller-Free Point Cloud VR is a collaborative effort to transform cultural and historical spaces into immersive, enduring experiences. Our approach combines hand tracking, multiplayer VR collaboration, and projector mapping so that our audience can interact effortlessly with the point clouds. These objectives reflect our commitment to a more accessible and engaging VR experience, in line with the goals and criteria of the grant.