Flying 3D pixels let humans interact with virtual objects

A professor from Queen’s University’s Human Media Lab is about to reveal a new system that will change the way people interact with virtual reality.

Professor Roel Vertegaal and his students have developed an interactive swarm of flying 3D pixels (voxels) called BitDrones, which allows users to explore virtual 3D information by interacting with physical, self-levitating building blocks.

BitDrones in action. (Image Credit: Human Media Lab)

The professor and his students will unveil the BitDrones system on Monday, November 9 at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina.

BitDrones uses swarms of nano quadcopters to bring humans a step closer to interactive, self-levitating programmable matter: material that can change its 3D shape on command.

“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality,” said Dr. Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

A variety of BitDrones called ShapeDrones. (Image Credit: Human Media Lab)

The team created three types of BitDrones, each a self-levitating display with a different resolution. “PixelDrones” are equipped with one LED and a small dot-matrix display. “ShapeDrones” are augmented with a lightweight mesh and a 3D-printed geometric frame, and serve as building blocks for complex 3D models. “DisplayDrones” are fitted with a curved, flexible high-resolution touchscreen, a forward-facing video camera and an Android smartphone board.

All three types of BitDrones are equipped with reflective markers, so each drone can be individually tracked and positioned in real time via motion-capture technology. The system also tracks the user’s hand motion and touch, allowing users to manipulate the voxels in space.
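The paper does not publish its tracking code, but the touch interaction described above can be sketched roughly as follows. This is a hypothetical illustration, not the Human Media Lab implementation: each frame, the motion-capture system reports drone and hand positions; a hand coming within a small threshold of a drone counts as a touch, and a touched drone retargets to follow the hand.

```python
# Hedged sketch of touch-based voxel manipulation. All names and the
# grab threshold are assumptions for illustration, not the real API.
from dataclasses import dataclass

TOUCH_RADIUS = 0.05  # metres; assumed distance at which a "touch" registers

@dataclass
class Drone:
    id: str
    position: tuple  # (x, y, z) as reported by motion capture
    target: tuple    # setpoint the flight controller steers toward

def update(drones, hand_pos):
    """Any drone within TOUCH_RADIUS of the hand follows the hand."""
    for d in drones:
        dist = sum((a - b) ** 2 for a, b in zip(d.position, hand_pos)) ** 0.5
        if dist < TOUCH_RADIUS:
            d.target = hand_pos  # touched voxel tracks the user's hand
    return drones

drones = [Drone("pixel-1", (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))]
update(drones, (0.0, 0.0, 1.02))  # hand 2 cm from the drone: a touch
```

In a real system this loop would run at motion-capture frame rates, with the target fed to each quadcopter's position controller.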

“We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset,” said Dr. Vertegaal.

The team highlights many possible applications for the new technology, including real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.

For example, users could physically explore a file folder by touching the folder’s associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it. A user can browse the files by physically swiping drones to the left or right.
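The horizontal-wheel layout described above can be sketched with simple circle geometry. The function below is my own illustration of the idea (the geometry and the 30 cm offset are assumptions, not published parameters): file drones are spaced evenly on a circle below the folder drone, and a swipe rotates every drone by one slot.

```python
# Hedged sketch of the "horizontal wheel" file-browsing layout.
import math

def wheel_positions(center, radius, n, offset=0):
    """Place n file drones evenly on a horizontal circle below `center`.

    `offset` shifts every drone by that many slots, modelling a swipe.
    """
    cx, cy, cz = center
    positions = []
    for i in range(n):
        angle = 2 * math.pi * (i + offset) / n
        positions.append((cx + radius * math.cos(angle),
                          cy - 0.3,  # assumed: wheel hovers 30 cm below folder
                          cz + radius * math.sin(angle)))
    return positions

before = wheel_positions((0.0, 1.5, 0.0), 0.5, 4)
after = wheel_positions((0.0, 1.5, 0.0), 0.5, 4, offset=1)  # one swipe
```

After a swipe, each drone flies to the slot its neighbour previously occupied, which is what makes the wheel appear to rotate under the user's hand.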

Users could also stack ShapeDrones like building blocks to create real-time 3D models, or use the BitDrones system for remote telepresence, appearing locally through a DisplayDrone running Skype.

The DisplayDrone would be capable of automatically tracking and replicating all of the remote user’s head movements so that he or she could virtually inspect a location.

The current system supports only dozens of comparatively large drones, 2.5 to 5 inches in size, but the team is working to scale it to thousands. These future drones would measure about half an inch each, allowing users to render more seamless, higher-resolution programmable matter.
