Microsoft Research recently introduced Virtual Robot Overlay for Online Meetings (VROOM), a system that combines AR and VR to bring life-sized avatars into the workplace in the form of telepresence robots. In a recently released paper, VROOM's creators said the system is meant to make a person working remotely in VR and a person in the office wearing a HoloLens AR headset feel like they're in the same place.
A Windows Mixed Reality headset tracks the remote worker's pose and head movement, while the telepresence robot gives them a 360-degree view of the office. A Unity app animates the avatar shown to the HoloLens wearer based on the remote worker's movements.
Hand gestures and arm movements are captured by handheld controllers and visible to both participants. To make the avatar seem more lifelike, the system adds mouth movement when the remote worker talks, along with blinking and idle motion. VROOM converts a 2D image of the remote worker's face into a 3D model placed on the avatar's head.
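The animation pipeline described above — copying tracked head pose onto the avatar, then layering in blinking and idle motion so it never looks frozen — can be sketched roughly as follows. This is a hypothetical illustration only: VROOM's actual avatar is animated in a Unity app, and every name, timing constant, and structure below is invented.

```python
import math

class AvatarAnimator:
    """Toy sketch: tracked pose plus idle sway and periodic blinks.

    Hypothetical — not VROOM's implementation. The blink interval,
    blink duration, and sway amplitude are assumed values.
    """

    BLINK_INTERVAL = 4.0   # seconds between blinks (assumed)
    BLINK_DURATION = 0.15  # how long the eyes stay closed (assumed)

    def __init__(self):
        self.head_yaw = 0.0    # degrees, driven by the VR headset
        self.head_pitch = 0.0
        self.idle_sway = 0.0   # subtle motion so the avatar looks alive
        self.eyes_open = True
        self._clock = 0.0

    def apply_tracked_pose(self, yaw, pitch):
        """Copy the remote worker's tracked head pose onto the avatar."""
        self.head_yaw = yaw
        self.head_pitch = pitch

    def tick(self, dt):
        """Advance the idle animation and blink cycle by dt seconds."""
        self._clock += dt
        # Slow sinusoidal sway stands in for idle body motion.
        self.idle_sway = 0.5 * math.sin(self._clock * 0.8)
        # Close the eyes briefly at the start of each blink cycle.
        phase = self._clock % self.BLINK_INTERVAL
        self.eyes_open = phase > self.BLINK_DURATION

avatar = AvatarAnimator()
avatar.apply_tracked_pose(yaw=30.0, pitch=-5.0)
avatar.tick(0.1)          # 0.1 s in: still inside the blink window
print(avatar.eyes_open)   # False
avatar.tick(1.0)          # 1.1 s in: blink over
print(avatar.eyes_open)   # True
```

The point of the layering is that tracked input (pose) and procedural input (blinks, sway) are independent channels combined each frame, so the avatar stays lifelike even when the remote worker holds still.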
The VROOM system also gives the VR user a first-person view so the remote worker can see their hand movements and gestures. The avatar then appears to walk when the remote worker instructs the robot to move.
VROOM was written by Simon Fraser University Ph.D. candidate and Microsoft Research intern Brennan Jones, Microsoft Research engineer Yaying Zhang, Microsoft Research social communication tech senior researcher Sean Rintel, and Microsoft researcher Priscilla Wong.
“Although we used a telepresence robot with a screen so that we could run a comparison between standard robotic telepresence and VROOM (to be reported in a future paper), the screen would be unnecessary in a space where all local users wore a headset. Thus, a future iteration could use any drivable robot with a 360-degree camera on a pole reaching head-height,” the authors said in the paper.
The authors expect that technology like VROOM may be best applied in industries where work involves whiteboard or design sessions, letting participants follow one another's gaze and attention and take part on an equal footing. Other documented uses of telepresence robots include museum visits, remote academic conference attendance, and long-distance relationships.
VROOM currently demonstrates only one-on-one interactions. Future work may explore sessions with multiple people, experiments in shared mixed reality workspaces, or less expensive mobile robots.
VROOM was accepted for publication at the ACM CHI Conference on Human Factors in Computing Systems as a sequel to a prior research contribution. The conference was scheduled to take place in Hawaii last week but was largely canceled due to COVID-19 and global shutdown orders. That earlier work, published at ACM CHI in 2018, introduced Mini-Me, a way to bring avatars into meetings and interactions in mixed reality.
This post by Khari Johnson originally appeared on VentureBeat.