From the ground, the green and white lights hovering above the city of Nottingham probably resembled a distant storm, but from the window of a Cessna 172 aircraft, the shape of a man on horseback could clearly be seen galloping across the darkened troposphere.
This green night rider is the result of three years of hard graft by artist Dave Lynch, scientist Mike Nix and maker Aaron Nielsen, pushing the boundaries of art and science.
Together they pulled off a world first in June when they managed to project moving images directly onto clouds from an aircraft.
And while fighting the elements, failed kit and lack of cash in their quest to see the rider in the clouds – a work they call Project Nimbus – they’ve discovered the real importance of collaboration.
Project Nimbus used a laser version of the zoopraxiscope, a device designed by pioneering 19th-century photographer Eadweard Muybridge. So it was only right that they should also use his famous image of a galloping horse, The Horse in Motion, for the display.
“It was amazing,” says Lynch, who spent hours searching for the “right type of cloud” as he shot the video and Nix operated the zoopraxiscope. “After an hour of flying and almost giving up, we had come up above a cloud layer into peaks, swirls and canyons stretching out like an ocean, giving us the conditions we never thought we would see.”
Gods of war
The journey started in 2007 when Lynch was studying for his master’s degree and came across a military paper detailing work on weapons since the Vietnam War. One of them involved projecting an ancient god onto the clouds over an enemy city (whose public communications had been seized) in order to terrify the citizens.
Lynch was inspired by the idea, but soon found his early experiments in 2007 with a converted 16-millimetre cine projector with a laser light source impractical due to weight and power requirements.
After experimenting with projecting moving image loops of swimming dolphins from vehicles, he came across Muybridge’s work and ended up projecting the famous horse onto the streets of Leeds. By 2012, funding from the AND festival and the arts incubator Octopus Collective kickstarted three years of research into developing a modern zoopraxiscope.
In the 1870s, Muybridge was commissioned by a rich racehorse owner to study animal motion. He captured photographs of horses running on a track using multiple cameras and fast shutter speeds, and arranged them on discs to project with his zoopraxiscope. The iconic moving images that resulted, Sallie Gardner at a Gallop or The Horse in Motion, clearly demonstrated that a galloping horse takes all four hooves off the ground at the same time.
Lynch realised that for Project Nimbus only lasers could give the sharp image he needed, especially when projecting onto the very tricky medium of clouds. His first zoopraxiscope was a lashed-together affair, made from bits of recycled technology and a 2-watt blue laser bought on eBay for $35.
Lynch asked Nix and Ben Whitaker, both in the chemistry department at the University of Leeds, for help with laser safety and getting sharper images, and he was surprised to learn that Muybridge had inspired work in their field, too.
“It turns out that the field of ultrafast laser spectroscopy, which aims to ‘freeze-frame’ molecular motion, also draws an analogy with Muybridge. Nobel prizewinner Ahmed Zewail even referred to the same horse projection we used in his prize acceptance speech,” says Nix.
The challenge was to work out how a zoopraxiscope’s projection mechanism worked in practice. It took a conversation with Stephen Herbert, a specialist in early cinema who had worked on replica zoopraxiscopes, to show that the image discs and shutter slits needed to rotate in sync and in opposite directions.
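The synchronisation requirement can be illustrated with a toy timing model. Assuming, purely for illustration, a disc of 14 frames and a rotation rate of 10 revolutions per second (not Project Nimbus's actual parameters): when the image disc and shutter disc turn at the same rate, each shutter opening crosses the projection aperture just as the next frame arrives in the gate, so the projection steps through the animation in order.

```python
# Toy timing model: an image disc with N frames and a shutter disc with N
# openings, geared to turn at the same rate. Each time an opening crosses
# the projection aperture, we record which frame sits in the gate.
# Frame count and rotation rate are assumed values for illustration.
N = 14                      # frames/openings on the discs, as in the article
rev_per_s = 10.0            # assumed rotation rate

flash_interval = 1.0 / (rev_per_s * N)   # one flash per opening crossing
frames_shown = []
for k in range(2 * N):                   # two full revolutions
    t = k * flash_interval
    image_angle = (rev_per_s * t) % 1.0  # image disc position, in revolutions
    frame = int(round(image_angle * N)) % N
    frames_shown.append(frame)

# Synchronised discs step through every frame in order, looping the animation.
print(frames_shown[:N])
```

If the discs drift out of sync, the frame index at each flash no longer advances by exactly one, and the projected motion judders or freezes; the opposite rotation directions Herbert described additionally compensate for image motion during each flash.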
The original Muybridge slit method blocked most of the light from the laser, resulting in a very dim image that would be impossible to see on a cloud, even at night.
The breakthrough was to replace the 14 slits in the zoopraxiscope’s shutter disc with 14 hemispherical lenses, replicating the slits by turning the circular laser beam into focused lines of light which sweep the image in place of the shutter.
The result was a sharp, bright image – even when projected onto the nebulous medium of clouds at variable distances from the projector – in this case up to 50 metres away.
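The brightness gain from swapping slits for lenses can be estimated with a back-of-the-envelope calculation. A slit shutter only passes light while a slit is in the beam, so its throughput is roughly the fraction of the disc's circumference the slits occupy; lenses instead refocus essentially the whole beam into a sweeping line. The disc radius and slit width below are assumed values, not the project's real dimensions:

```python
# Rough estimate of the light a 14-slit shutter passes versus lenses that
# redirect the beam instead of blocking it. Geometry is assumed for
# illustration; only the slit count comes from the article.
import math

disc_radius_mm = 100.0   # assumed disc radius at the beam path
slit_width_mm = 1.0      # assumed slit width
n_slits = 14             # slit/lens count from the Project Nimbus shutter disc

circumference = 2 * math.pi * disc_radius_mm
# Fraction of each revolution during which light gets through the slits.
slit_duty_cycle = n_slits * slit_width_mm / circumference

# Hemispherical lenses refocus rather than block, so nearly all light is used.
lens_duty_cycle = 1.0

print(f"slit shutter passes ~{slit_duty_cycle:.1%} of the light")
print(f"lens shutter: roughly {lens_duty_cycle / slit_duty_cycle:.0f}x brighter")
```

Under these assumed dimensions the slits would waste about 98 per cent of the laser's output, which is consistent with the article's point that a slit shutter gave an image too dim to see on a cloud.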
“Historical analogue technology was the only safe way to use a laser in the sky, as other laser projectors work by scanning an image using a dangerous pencil beam,” says Nix.
“To see the work purely as the spectacle of a horse a mile high on the clouds obscures the greater success of genuine collaboration,” Nix continues. Project Nimbus was also involved in the formation of a group exploring future collaboration between art, science and the maker community.
Lynch will be reviewing the three-year project for the Foundation for Art and Creative Technology, and will be talking about his work this Saturday, 4 July.
As for the future, Lynch says: “We’d love to collaborate with someone like flight pioneer Richard Branson to develop a digital art piece which allows us to interact and experience the world through cloud projections.”