Zoomorph - An alternate realities app simulating how animals see color

Developed with the help of color vision scientists and based on scientific data, Zoomorph simulates how 50 different species of animals see colors.

Scientists learn about color vision in animals (human and non-human) by studying the physiology of the eye, specifically color receptors, and by doing behavioral studies. The data produced shows what an animal has the potential to see - how an animal actually experiences colors is difficult to determine, but Zoomorph might help you imagine those experiences.

Most of the species represented in the app have one or two kinds of color receptors, or "cones," versus the three we possess. A couple of primates with the same kind of color vision as ours are included in the app just to show that we are not alone.
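To give a rough sense of what simulating a two-cone (dichromatic) view from an ordinary three-cone image can involve, the sketch below converts an RGB color into an approximate LMS cone-response space, reconstructs the missing cone signal from the two remaining ones, and converts back. The matrices and coefficients are generic, illustrative values from the color-vision literature (in the spirit of Viénot et al. 1999); they are not Zoomorph's own algorithm or data.

    // A minimal, illustrative sketch of dichromat (two-cone) simulation:
    // convert linear RGB to an approximate LMS cone-response space,
    // reconstruct the "missing" cone signal from the two remaining ones,
    // and convert back to RGB. Matrices and coefficients are generic
    // values from the color-vision literature, not Zoomorph's own.

    typealias Vec3 = (Double, Double, Double)

    // Multiply a 3x3 matrix (given as rows) by a 3-vector.
    func mul(_ m: [[Double]], _ v: Vec3) -> Vec3 {
        return (m[0][0]*v.0 + m[0][1]*v.1 + m[0][2]*v.2,
                m[1][0]*v.0 + m[1][1]*v.1 + m[1][2]*v.2,
                m[2][0]*v.0 + m[2][1]*v.1 + m[2][2]*v.2)
    }

    // Approximate linear-RGB <-> LMS matrices (illustrative values).
    let rgbToLMS: [[Double]] = [
        [0.31399, 0.63951, 0.04650],
        [0.15537, 0.75789, 0.08670],
        [0.01775, 0.10944, 0.87277]
    ]
    let lmsToRGB: [[Double]] = [
        [ 5.47221, -4.64196,  0.16963],
        [-1.12524,  2.29317, -0.16790],
        [ 0.02980, -0.19318,  1.16365]
    ]

    // Simulate a viewer lacking the long-wavelength (L) cone: the L response
    // is estimated from M and S with fixed coefficients (after Viénot et al. 1999).
    func simulateDichromat(_ rgb: Vec3) -> Vec3 {
        let lms = mul(rgbToLMS, rgb)
        let projected: Vec3 = (2.02344 * lms.1 - 2.52581 * lms.2, lms.1, lms.2)
        let out = mul(lmsToRGB, projected)
        // Clamp to the displayable 0...1 range.
        return (min(max(out.0, 0), 1), min(max(out.1, 0), 1), min(max(out.2, 0), 1))
    }

    // Example: a saturated red loses most of its distinctness for a two-cone viewer.
    print(simulateDichromat((1.0, 0.1, 0.1)))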

Zoomorph is an ongoing project. Future versions will simulate the vision of species that have more kinds of cones than we do, and can therefore see many more colors, such as birds, fishes, crustaceans and insects. Future releases will also simulate sensitivity to ultraviolet light, acuity, peripheral vision, night vision and more experimental aspects of vision, such as magnetoreception. In addition, non-scientific ways of learning about animal vision will be incorporated, such as knowledge created by telepathic animal communicators and shamanistic practitioners.

iPhone App in iTunes


The Zoomorph Project


Concept, Direction and Design: Lisa Jevbratt, Professor, Department of Art and The Media Art Technology Program, University of California Santa Barbara

Simulation Algorithm Programming: Javier Villegas, PhD, The Media Art Technology Program, University of California Santa Barbara

iOS App Programming: Charlie Roberts, PhD Candidate, The Media Art Technology Program, University of California Santa Barbara



Made possible by a grant from Creative Capital