A new camera system allows ecologists and filmmakers to produce videos that accurately replicate the colors that different animals see in natural settings, Vera Vasas at the University of Sussex, UK, and colleagues from the Hanley Color Lab at George Mason University, US, report in the open access journal PLOS Biology, publishing January 23rd.
Traditional methods for quantifying the colors animals see, such as spectrophotometry, are time-consuming and cannot capture color in motion. To address these limitations, the researchers developed a novel camera and software system that captures animal-view videos of moving objects under natural lighting conditions. The camera simultaneously records video in four color channels: blue, green, red and UV. These data can be processed into “perceptual units” to produce an accurate video of how those colors are perceived by animals, based on existing knowledge of the photoreceptors in their eyes. The team tested the system against a traditional method that uses spectrophotometry and found that the new system predicted perceived colors with an accuracy of over 92%.
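The published pipeline is more involved, but the core channel-mixing step can be illustrated with a minimal sketch: a linear transform maps the camera's four recorded channels onto an animal's photoreceptor responses, which are then normalized into relative stimulations. Everything below is hypothetical for illustration; the matrix coefficients, the three-receptor animal, and the function name are made up, not taken from the paper.

```python
import numpy as np

# Hypothetical 3x4 matrix mapping the camera's four channels
# (UV, blue, green, red) to a trichromatic animal's three photoreceptors
# (e.g., a bee's UV, blue, and green receptors). In practice, coefficients
# like these would be fit from the camera's spectral sensitivities and
# the animal's known photoreceptor sensitivities.
CAMERA_TO_RECEPTOR = np.array([
    [0.9, 0.1, 0.0, 0.0],   # UV receptor: driven mostly by the UV channel
    [0.1, 0.8, 0.1, 0.0],   # blue receptor
    [0.0, 0.1, 0.7, 0.2],   # green receptor
])

def camera_to_perceptual_units(pixels: np.ndarray) -> np.ndarray:
    """Map an (H, W, 4) camera frame to (H, W, 3) relative receptor catches.

    Each pixel's four channel values are linearly combined into per-receptor
    catch estimates, then normalized so the three catches sum to 1 --
    "perceptual units" here meaning relative receptor stimulation.
    """
    catches = pixels @ CAMERA_TO_RECEPTOR.T              # linear channel mixing
    total = catches.sum(axis=-1, keepdims=True)
    return catches / np.clip(total, 1e-9, None)          # relative stimulation

frame = np.random.rand(4, 4, 4)  # toy 4x4 frame; channels: UV, B, G, R
units = camera_to_perceptual_units(frame)
```

Applied frame by frame, a transform of this kind is what lets a four-channel video be re-rendered as an animal-view video.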
This camera system will open new avenues of research for scientists, and allow filmmakers to produce dynamic, accurate depictions of how animals see the world around them, the authors say. The system is built from commercially available cameras housed in a modular, 3D-printed casing, and the software is released as open source, allowing other researchers to use and build on the technology in the future.
Senior author Daniel Hanley adds, “We’ve long been fascinated by how animals see the world. Modern techniques in sensory ecology allow us to infer how static scenes might appear to an animal; however, animals often make crucial decisions on moving targets (e.g., detecting food items, evaluating a potential mate’s display, etc.). Here, we introduce hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colors in motion.”