Flying pixels create self-levitating displays

A team of researchers at Queen’s University’s Human Media Lab has unveiled BitDrones, a system of flying drones that lets users explore virtual 3D information by interacting with physical, self-levitating voxels. BitDrones is a first step towards interactive, self-levitating programmable matter: materials capable of changing their 3D shape in a programmable fashion using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including Real Reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.

Roel Vertegaal and his students, who created the technology, say that it is a first step towards creating interactive, self-levitating objects that can change their shape in a programmable way using swarms of small quadcopters. “BitDrones brings flying programmable matter closer to reality,” said Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.” The current system comprises three types of drone. “PixelDrones” are equipped with an LED and a small dot-matrix display. “ShapeDrones” have a lightweight mesh and a 3D-printed frame, and serve as building blocks for 3D models. Finally, “DisplayDrones” have a high-resolution touchscreen and a forward-facing video camera, powered by an Android smartphone. All three carry reflective markers, so their positions can be tracked in real time with motion-capture technology.
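The three drone types share a common capability (marker-based position tracking) and differ in their output hardware. As a rough illustration only, not the lab’s actual software, the hierarchy might be modelled like this in Python; all class and field names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Drone:
    """Base class: every BitDrone carries reflective markers, so a
    motion-capture system can report its 3D position each frame."""
    drone_id: int
    position: tuple = (0.0, 0.0, 0.0)  # metres, from motion capture

    def update_position(self, x, y, z):
        # Called whenever the tracking system delivers a new frame.
        self.position = (x, y, z)

@dataclass
class PixelDrone(Drone):
    """Carries an LED and a small dot-matrix display."""
    led_color: tuple = (255, 255, 255)  # RGB colour of the onboard LED
    display_text: str = ""              # text on the dot-matrix display

@dataclass
class ShapeDrone(Drone):
    """Lightweight mesh over a 3D-printed frame; a building block
    for physical 3D models. No extra output hardware."""
    pass

@dataclass
class DisplayDrone(Drone):
    """Carries a touchscreen and forward-facing camera, driven by an
    Android smartphone; used for telepresence."""
    video_call_active: bool = False
```

Keeping tracking in the base class mirrors the article’s point that all three drone types are tracked the same way, while each subclass only adds its own display hardware.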

‘You can actually touch these pixels.’ “We call this a Real Reality interface rather than a Virtual Reality interface,” says Vertegaal. “This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset.” Eventually, Vertegaal hopes, such a system could let us navigate digital information, recreate digital 3D objects in the real world, and even enable telepresence applications in which a remote person’s head movements are tracked and replicated, letting them inspect a location from afar.

Dr. Vertegaal and his team describe a number of possible applications for this technology. In one scenario, users could physically explore a file folder by touching the folder’s associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it, and files in this wheel are browsed by physically swiping drones to the left or right. Users would also be able to manipulate ShapeDrones as building blocks for a real-time 3D model. Finally, the BitDrones system supports remote telepresence: a remote user can appear locally through a DisplayDrone running Skype. The DisplayDrone automatically tracks and replicates the remote user’s head movements, allowing them to virtually inspect a location and making their actions easier for the local user to follow.
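The folder-browsing scenario above amounts to arranging one drone per file on a circle and rotating the circle on each swipe. The following sketch, assuming a simple geometric model of my own invention (the class, its methods, and the coordinates are all hypothetical, not the lab’s code), shows one way the target positions for the wheel could be computed:

```python
import math

class FolderWheel:
    """Hypothetical model of the 'horizontal wheel' gesture: each file
    is shown on a PixelDrone flying on a circle below the folder drone,
    and swiping rotates which file faces the user."""

    def __init__(self, filenames, radius=0.5):
        self.filenames = list(filenames)
        self.radius = radius  # wheel radius in metres
        self.front = 0        # index of the file currently facing the user

    def swipe(self, direction):
        """direction: +1 for a swipe right, -1 for a swipe left."""
        self.front = (self.front + direction) % len(self.filenames)

    def drone_positions(self, centre=(0.0, 1.0, 0.0)):
        """Return a target (x, y, z) for each file's drone, evenly
        spaced around the wheel, with the 'front' file at angle 0."""
        n = len(self.filenames)
        positions = []
        for i, name in enumerate(self.filenames):
            angle = 2 * math.pi * ((i - self.front) % n) / n
            x = centre[0] + self.radius * math.sin(angle)
            z = centre[2] + self.radius * math.cos(angle)
            positions.append((name, (x, centre[1], z)))
        return positions
```

After each swipe, a flight controller would fly every drone toward its new target position, producing the rotating-wheel effect the article describes.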

While the system currently supports only dozens of comparatively large drones, each 2.5” to 5” in size, the team at the Human Media Lab are working to scale it up to support thousands. These future drones would measure no more than half an inch, allowing users to render more seamless, high-resolution programmable matter.


For more information please visit: www.queensu.ca
