Imagine transforming any ordinary space into an immersive, interactive display environment! That is exactly what the SENSEable City Laboratory and the ARES Lab (Aerospace Robotics and Embedded Systems Laboratory) at MIT are aiming for.
From the project website:
In its first implementation, the Flyfire project sets out to explore the capabilities of this display system by using a large number of self-organizing micro helicopters. Each helicopter contains small LEDs and acts as a smart pixel. Through precisely controlled movements, the helicopters perform elaborate and synchronized motions and form an elastic display surface for any desired scenario.
With the self-stabilizing and precise control technology from the ARES Lab, the motion of the pixels is adaptable in real time. The Flyfire canvas can transform itself from one shape to another or morph a two-dimensional photographic image into an articulated shape. The pixels are physically engaged in transitioning images from one state to another, which allows the Flyfire canvas to deliver a spatially animated viewing experience.
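The idea of a canvas morphing between shapes can be pictured as each "smart pixel" interpolating from its position in one formation to its position in the next. The sketch below is purely illustrative, assuming a simple linear interpolation of target positions; it is not the actual Flyfire control algorithm, whose real-time stabilization and coordination are far more involved.

```python
def morph(start, target, t):
    """Interpolate each smart-pixel position between two shapes.

    start, target: lists of (x, y, z) coordinates, one per helicopter pixel.
    t: transition parameter in [0, 1]; 0 gives the start shape, 1 the target.
    Hypothetical sketch only -- not the Flyfire project's method.
    """
    return [
        tuple(a + (b - a) * t for a, b in zip(p, q))
        for p, q in zip(start, target)
    ]

# Example: morph a flat two-pixel "image plane" into a tilted one.
flat = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
tilted = [(0.0, 0.0, 1.0), (1.0, 0.0, 2.0)]
halfway = morph(flat, tilted, 0.5)
# halfway == [(0.0, 0.0, 1.0), (1.0, 0.0, 1.5)]
```

In a real system, each interpolated waypoint would feed a flight controller that drives the helicopter toward it while avoiding its neighbors.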
Visit the Flyfire project website to learn more.