To promote ASICS running shoes during the New York Marathon, Zooloop's technical team, hired by Inwindow Outdoor LLC, helped develop an experience that let users interact with a group of "virtual runner" silhouettes. These runners were projected onto a 60-foot surface and moved along with passersby. The virtual runners were masked by the silhouette of each person passing in front of the installation, so that as people moved, new parts of the virtual scene were revealed. As an additional interactive element, when a person stood still in front of the screen they were shown videos about the marathon and the values promoted by the brand, as well as written messages that appeared to come from the virtual runners.
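The masking idea described above can be sketched as a simple per-pixel composite: wherever a passerby's silhouette is detected, the virtual-runner scene shows through. This is a minimal illustration using NumPy arrays; the function and array names are hypothetical and do not reflect the installation's actual code.

```python
import numpy as np

def composite(background, runner_scene, silhouette_mask):
    """Reveal the virtual-runner scene only where a silhouette is
    detected (mask == True); elsewhere show the static background.
    All images share the same H x W spatial shape."""
    mask = silhouette_mask[..., np.newaxis]  # broadcast over RGB channels
    return np.where(mask, runner_scene, background)

# Tiny 2x2 example: the mask reveals the runner scene in the left column.
bg = np.zeros((2, 2, 3), dtype=np.uint8)          # black background
scene = np.full((2, 2, 3), 255, dtype=np.uint8)   # white "runner" scene
mask = np.array([[True, False], [True, False]])
out = composite(bg, scene, mask)
```

In the real installation the mask would be regenerated every frame from the depth-camera data, so the reveal tracks people in motion.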
Using a 60-foot screen
The challenge was twofold. First, it was necessary to generate a virtual work area with enough resolution to display every element in the scene. Second, the team had to integrate a hardware system able to project the complete image without cuts or misalignments.
The first point was solved with AMD Eyefinity, multi-display software that made it possible to work at a resolution of 7680 × 1080 pixels. Once this large scene was built, a synchronized array of digital projectors displayed the installation across the full width of the screen.
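To give a sense of how a wide virtual canvas maps onto a projector array, here is a small sketch that splits the 7680 × 1080 canvas into equal horizontal tiles, one per output. The source does not state how many projectors were used; four is an assumption for illustration (7680 = 4 × 1920).

```python
def projector_tiles(canvas_w=7680, canvas_h=1080, outputs=4):
    """Split a wide virtual canvas into equal horizontal tiles,
    one (x, y, w, h) rectangle per projector output.
    The output count of 4 is an illustrative assumption."""
    tile_w = canvas_w // outputs
    return [(i * tile_w, 0, tile_w, canvas_h) for i in range(outputs)]

tiles = projector_tiles()
# Each tile is a 1920x1080 slice of the Eyefinity canvas.
```

In practice Eyefinity presents the tiled outputs to the application as one large desktop, so the rendering code draws to a single 7680 × 1080 surface and the driver handles the split.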
Guaranteeing real-time execution
Because of the scene's large size and the need for multiple high-resolution videos, performance was critical: any lag would break the immersive feeling that the experience was following its users in real time. To solve this, the processing algorithms (both the video rendering and the depth-camera processing) were written to run on the GPU, exploiting parallel processing, reducing CPU load, and lowering response times.
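The depth-camera step is a good example of why this work maps so well to the GPU: every pixel can be classified independently. The sketch below expresses that per-pixel data parallelism with a vectorized NumPy operation standing in for a GPU kernel; the depth thresholds are illustrative assumptions, not values from the installation.

```python
import numpy as np

def depth_to_silhouette(depth_mm, near=500, far=3500):
    """Turn a raw depth image (millimeters) into a boolean
    silhouette mask. Each pixel is tested independently against
    an interaction range, the same embarrassingly parallel
    pattern a GPU shader or kernel would exploit.
    The 500-3500 mm range is an illustrative assumption."""
    return (depth_mm > near) & (depth_mm < far)

# A 2x2 depth frame: only pixels inside the range join the silhouette.
depth = np.array([[400, 1000],
                  [3500, 2000]])
sil = depth_to_silhouette(depth)
```

Because no pixel's result depends on any other, the same logic runs in one pass over thousands of GPU threads, which is what keeps the response time low enough to feel real-time.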
Capturing several interactions simultaneously
With an installation this large, several people would certainly be interacting with it at any given time. To meet that requirement, four Kinect sensors were used simultaneously and their data was combined into a single image, in order to generate the silhouettes and support multiple simultaneous interactions with the application.
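Combining the four sensors' output can be pictured as stitching per-sensor silhouette masks into one wide mask covering the whole screen. This is a simplified sketch that assumes each Kinect covers a non-overlapping horizontal segment; the real rig would almost certainly need calibration and overlap blending between adjacent sensors.

```python
import numpy as np

def combine_silhouettes(masks):
    """Stitch per-sensor boolean silhouette masks side by side
    into a single wide mask. Assumes non-overlapping, pre-aligned
    horizontal segments (a simplifying assumption)."""
    return np.hstack(masks)

# Four sensors, each covering a 4x3 segment -> one 4x12 combined mask.
per_sensor = [np.zeros((4, 3), dtype=bool) for _ in range(4)]
per_sensor[1][2, 1] = True  # a person detected by the second sensor
combined = combine_silhouettes(per_sensor)
```

Once merged, the single wide mask feeds the same compositing step regardless of which sensor saw each person, so interactions anywhere along the 60-foot screen behave identically.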
Videos by Inwindow Outdoor
Sensors – Kinect. To capture the movements and interactions of users of the installation.
Parallel processing on the GPU. To efficiently distribute the huge computational load caused by the complexity and size of the assets, and by the multiple simultaneous user interactions.
The installation, located in a New York subway station, attracted an unprecedented number of interactions, raising awareness of the event among passersby. The experience was part of a strategy that potentially increased the number of runners in the marathon.