FINAL PROJECT : Progress Report
How Animals See: Inside the Extraordinary World of Animal Vision
Short Description:
A web-based VR experience exploring the differences between human and animal vision. It offers an artistic interpretation of how the world might appear to one of four animals – a bee, a dog, a bat, and a rattlesnake – and of how each animal's visual perception enhances its ability to find food or detect predators and prey.
Long Description:
Not all animals see the world the way we do. Some species see brightly hued landscapes in colors invisible to human eyes; others can’t see color at all. Unlike shape or size, color is not an inherent property of an object but a result of the sensory system of the viewer. Every organism, then, lives in a somewhat unique world determined by its senses. How Animals See: Inside the Extraordinary World of Animal Vision is a web-based Virtual Reality experience exploring the differences between human and animal visual perception. An Android app built with Three.js allows viewers to see the world through the eyes of four animals – a bee, a dog, a rattlesnake, and a bat – simply by accessing the phone’s native camera and placing the device inside a Google Cardboard. The visual experience is further enhanced by a soundtrack unique to each scene.
Medical researchers often look to the animal kingdom to better understand our own anatomy. It is possible that by learning more about animal vision we will find new ways to cure human eye diseases or even enhance our own visual system. Attempts to emulate nature’s use of ultrasonic sound – sending out high-frequency signals and analyzing the time delay of the returning echoes – have already been made, and the successful use of echolocation by humans can help blind people “see” merely by using sound. What I find especially interesting about exploring animal vision is its potential to help us build new technologies and improve current ones, e.g. more effective cameras and better visual aids. Night vision can find applications in car technologies (driving aids) and camera optics. At the same time, the more we know about animals’ eyes, the better we will understand the world they live in. By creating a first-person experience and immersing viewers in the sights of animals, I hope to create a sense of empathy and a deeper understanding of these creatures.
The project stems from my interest in senses and human perception. I started working on my thesis project with the goal of experimenting with sensory illusions in VR. However, once I dove into the world of sensory experiences and perception, I became fascinated by the differences between the way we humans see the world and the way other animals see it.
As a starting point for my project, I used a webcam and built custom filters that analyze the camera feed and alter pixels to imitate an animal’s vision. My goal was to allow viewers to see their own surroundings, rather than the natural environment of a given animal. I soon realized, however, that I may not be able to precisely recreate animal vision through image filters alone, as many animals can see beyond our visible spectrum. Therefore, in addition to the webcam feed, I decided to model 3D environments with pre-calculated colors, which allows for a more precise imitation of animal vision. While wearing a simple VR headset, viewers can experience their immediate surroundings through the eyes of one of four animals: a bee, a dog, a rattlesnake, and a bat. The app visualizes an artistic interpretation of how our own environment might appear to these animals and envisions how each animal’s visual perception enhances its ability to find food or detect predators and prey.
I decided to focus on a dog, a bee, a rattlesnake, and a bat specifically because of the diversity of their visual abilities. Unlike humans, who have three types of color-sensitive cone cells in the retina (red, green, and blue), dogs have dichromatic vision and can see only in shades of blue and yellow. Their vision is also less sensitive to both brightness and variations in shades of gray. Bees, on the other hand, see a world literally hidden from our eyes. The range of light they can see is shifted toward the violet end of the spectrum and away from the red. This means that bees can detect ultraviolet light, which helps them find nectar in flowers. Rattlesnakes sense infrared thermal radiation, which allows them to "see" radiant heat at wavelengths between 5 and 30 μm. Even a blind rattlesnake can use this sense to strike vulnerable body parts of its prey with great precision. Lastly, bats use echolocation – sound waves and their returning echoes – to determine where objects are in space, navigating and finding food in the dark. All these distinct and incredible features make for an interesting mix of visual perception capabilities.
***
The first experience I will be focusing on is an impression of the sensory perception of a rattlesnake. First, I tested a thermal camera, the Seek Compact, which connects directly to a cell phone. I wanted to shoot an outdoor environment – nature and animals in a park – to get a better sense of how a rattlesnake would see it.
The image quality I got from the camera turned out to be very poor; it did not work well outdoors. Another option for creating the thermal imaging effect is to manually apply a color map to the models. I can create a filter that imitates thermal imaging and apply it as a material. Since my goal is to create an “impression” of animal perception, I think a simulated thermal effect is acceptable as long as it stays fairly close to what an actual thermal image would look like.
To replicate the filter, I loaded a reference image of a “Hot Metal Bar” color scheme into my p5.js sketch and mapped its colors onto the pixels of a webcam stream, according to the brightness of each pixel in the original video – brighter RGB values become yellow (warmer values on the thermal scale), darker ones become blue/black (colder values).
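The mapping itself can be sketched as a pure function, independent of p5.js. The palette stops below are my own approximation of a “Hot Metal Bar” scheme, not the exact colors of the reference image:

```javascript
// Approximate "Hot Metal Bar" palette stops, from cold (dark) to hot (bright).
const STOPS = [
  [0, 0, 0],        // black  - coldest
  [32, 0, 128],     // deep blue
  [160, 16, 64],    // red/purple
  [255, 96, 0],     // orange
  [255, 224, 32],   // yellow
  [255, 255, 255],  // white  - hottest
];

// Map one pixel's brightness (0-255) to a palette color by linear interpolation.
function thermalColor(brightness) {
  const t = Math.min(Math.max(brightness / 255, 0), 1) * (STOPS.length - 1);
  const i = Math.min(Math.floor(t), STOPS.length - 2);
  const f = t - i;
  return STOPS[i].map((c, k) => Math.round(c + f * (STOPS[i + 1][k] - c)));
}

// Apply the palette in place to an RGBA pixel array (like p5's video.pixels).
function applyThermal(pixels) {
  for (let i = 0; i < pixels.length; i += 4) {
    const bright = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    [pixels[i], pixels[i + 1], pixels[i + 2]] = thermalColor(bright);
  }
  return pixels;
}
```

In a p5.js draw loop this would run over `video.pixels` after `video.loadPixels()`, with `updatePixels()` writing the remapped values back to the canvas.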
***
Next, I worked on creating a filter replicating bee vision.
I adjusted the contrast of the image to suggest the polarization patterns bees can detect. The following result is a JavaScript filter representing the color vision of bees within the human visible spectrum:
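The filter itself lived in the sketch; a minimal stand-in for the same idea might look like the following. The channel remap and the 1.2 contrast factor are rough illustrative assumptions, not measured bee photoreceptor responses:

```javascript
// Rough bee false-color filter. A webcam records no UV, so this only shifts
// the visible channels toward the violet end of the spectrum: red is
// discarded, and green and blue slide "down" into its place.
function beeColor(r, g, b) {
  const shifted = [g, b, Math.round((g + b) / 2)];
  // Mild contrast boost as a stand-in for polarization sensitivity.
  return shifted.map(c =>
    Math.min(255, Math.max(0, Math.round((c - 128) * 1.2 + 128)))
  );
}
```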
The interesting part of bee vision is that bees can see beyond our visible spectrum, into ultraviolet wavelengths. Unfortunately, it is not possible to imitate UV vision using filters on a webcam feed, as humans are simply not built to perceive these colors. After a couple of unsuccessful trials, I realized that there is no single filter, or combination of filters, that could closely imitate what seeing in the UV spectrum looks like.
To better represent bee vision, I may have to move into a 3D environment where I can place objects with pre-rendered UV maps representing ultraviolet reflectance.
***
The process of creating filters for the webcam feed works well, however, only for animals that see within the same visible spectrum as humans but either perceive fewer colors or are more sensitive to certain parts of it.
I decided to create an extra filter representing dog vision.
Dogs, contrary to popular belief, do not see the world in black and white. Their vision is actually most similar to that of people with red-green color blindness. But dogs differ from humans in other ways as well, including lower sensitivity to both brightness and variations in shades of gray.
Dogs are also quite nearsighted compared to humans. A special test, custom-made for dogs, puts them at around 20/75 vision, according to Psychology Today. This means that what a human can see from 75 feet away, a dog can only just make out from 20 feet.
Dogs are substantially worse than humans at detecting differences in brightness – or, looked at another way, different shades of an object. In fact, dogs are about two times worse at differentiating between shades than humans are.
To apply all these factors, I followed the same steps that I took while creating the bee vision filter:
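A rough per-pixel sketch of those steps could look like this. The blue-yellow collapse and the brightness-compression constants are illustrative assumptions, not a colorimetric model of dog vision:

```javascript
// Illustrative dichromatic "dog vision" transform.
function dogColor(r, g, b) {
  // Collapse red and green into one yellow-ish channel (blue-yellow vision).
  const yellow = Math.round(0.5 * r + 0.5 * g);
  // Compress values toward the middle to mimic weaker brightness discrimination.
  const compress = c => Math.round(c * 0.5 + 64);
  return [compress(yellow), compress(yellow), compress(b)];
}
```

A blur pass over the result would additionally suggest the reduced 20/75 acuity described above.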
***
To link all the filters together, I need to create a home page with a menu from which the user can choose which animal vision filter to view. So far I have been working on designs for a couple of menu screens:
I created a simple menu in Three.js for the three filters I have made so far:
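Before wiring up the Three.js scene, the menu can be modeled as plain data linking each animal icon to its filter. The identity functions below are placeholders; in the app each entry would point at the corresponding scene or shader:

```javascript
// Menu model: one entry per animal icon on the home screen.
const menu = [
  { name: 'bee',         filter: px => px },
  { name: 'dog',         filter: px => px },
  { name: 'rattlesnake', filter: px => px },
  { name: 'bat',         filter: px => px },
];

// Look up the entry for a tapped icon; null if nothing matches.
function selectAnimal(name) {
  return menu.find(item => item.name === name) || null;
}
```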
NEXT STEPS:
- create the menu screen
- create the BAT vision using Web Speech API and Three.js
- redesign the BEE vision experience possibly by creating a 3D environment
- link the menu scene and animal icons with filters corresponding to each animal
- work on the look of the app (title / menu / navigation)
- make an Android app that can be downloaded onto a phone
HOMEWORK 7 - 3/25
Final Project Proposal:
PRESENTATION
THROUGH THE EYES OF AN ANIMAL: HOW ANIMALS PERCEIVE THE WORLD
Ever wondered what the world looks like from the point of view of a snake, fish or fly?
THROUGH THE EYES OF AN ANIMAL: HOW ANIMALS PERCEIVE THE WORLD is a mobile VR application built with WebVR and Three.js exploring sensory experiences from a point of view of an animal.
The project stems from my interest in senses and human perception. A couple of months ago I began my research on false memories, which led me to a number of interesting articles relating to the sense of smell and its influence on memory. I became interested in the way humans experience the physical world and the possibilities and limitations of our senses. While researching human perception, I became fascinated with the way the world is perceived by different species depending on how their sense organs and nervous systems interpret cues. I learned that the same world can appear very differently to other species, and I became very curious how it would feel to “see the world” as other organisms see it.
The idea for my final project is based on my thesis project at ITP. For my thesis, I will be creating a VR experience with custom-made 3D environments inspired by animal perception. Apart from visual cues, I will be using sound and olfactory stimuli to further enhance the experience. I believe that a mobile application will allow more users to experience my project.
Not all animals have the same senses we do. In fact, the variety of sensory perception within the animal kingdom is quite vast.
The primary focus of my project is visual perception, because vision is the dominant human sense. Most of our information about the world comes in through our eyes, and vision controls almost all the basic actions necessary for living in our world.
The eye is a complex sensory organ that enables visual perception of the world, and it contains several tissues performing different tasks. One of the most basic aspects of eye function is the sensitivity of cells to light and the transduction of that signal through the optic nerve to the brain. Different organisms achieve these tasks in different ways. Eye function is therefore an important evolutionary subject as well, and different animal models provide unique opportunities for eye experimentation.
Rattlesnakes, copperheads, and cottonmouths are all pit vipers – snakes with two pits under their nostrils that detect heat, enabling them to hunt warm-blooded prey. The pits are so sensitive that the snake can determine the size of a warm-blooded animal and can even detect prey in complete darkness.
The snake's eyes can detect objects or movement from about 40 feet away, but its vision is much sharper when objects are closer. A rattlesnake's pupils are elliptical, not round, which enables the snake to see well in dim light. This is helpful for night hunting.
Honey bees have binocular vision, so they can judge distances. They have color vision, but see from orange to ultraviolet (UV) light: insects don’t see red light, while we humans can’t see UV light. An object that reflects UV can appear quite different to an insect than it does to our eyes. Bees also sense light polarization. Sunlight scattering in the atmosphere or reflecting off water becomes polarized: instead of being randomly aligned, the light waves tend to line up. This creates patterns in the sky that indicate direction, and makes water surfaces highly visible.
Echolocation, also called biosonar, is the biological sonar used by several kinds of animals, including dolphins and bats. They emit sounds and then listen to the returning echoes to locate and identify objects or creatures around them.
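The ranging idea behind echolocation boils down to a one-line calculation: an echo's round-trip delay, times the speed of sound, halved. A toy version:

```javascript
// Toy echolocation ranging: sound travels out and back, so the one-way
// distance is half the round trip. 343 m/s is roughly the speed of sound
// in air at about 20 degrees C.
const SPEED_OF_SOUND = 343; // m/s

function echoDistance(delaySeconds) {
  return (SPEED_OF_SOUND * delaySeconds) / 2;
}
```

A 20 ms delay, for example, puts an object about 3.4 m away – the kind of estimate a bat makes continuously, at much higher frequencies.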
HOMEWORK 6 - 3/11
Final Project Ideas - Brainstorming Process
1. Interactive Subway Map
NYURoute - 2D interactive city transit map
VanDam NYC Street Smart by VanDam - city map with 3D models of buildings and landmarks
"Pretty to look at but not very useful - by JeffIsler
Navigation is great if all you want to do is enjoy the cool 3D map. If you want to use it as a map, not so much. It shows a few buildings midtown, a few downtown, and a few in Brooklyn & Queens. You can't change the view elevation. Having tried it out, I can't say I'd ever open it again. I'm glad it was free."
2. Animal Vision
3. 3D Shopping Experience
I also would like to work with fashion advertisements in magazines and make it possible to see 3D models of advertised garments.
HOMEWORK 5 - 3/4
This week I continued working with Unity and Vuforia. My goal was to create a Unity project and make it accessible on Android. I decided to create an interactive subway map using Vuforia. The app would allow the user to scan a train number above a subway station entrance and see a city map with that train’s route. I believe the app could be helpful for those who are not fully familiar with the city’s subway system. Drawing on my own experience as an international student living in New York, many times I would come across a train station and wonder where the train could take me. I noticed that most subway maps are located inside train stations, usually on platforms, where they can be seen only after entering the station. The paper and online subway maps can also be difficult to decipher, as they outline all available subway lines throughout the city.
I decided to start with creating an interaction for one subway line. I used the following logo of the 6 train as my image tracker.
I then created a simple Manhattan map with an outline of the 6 train route.
In Unity, I added a 3D model of a subway station and a sphere with a street number where the station is located.
My goal is to add key landmarks of New York to help the user with their understanding of the map (add key museums, buildings, parks).
The app would support all of New York City’s subway lines. For each line, the full train route would be outlined on top of the city map, with additional spheres showing subway line numbers next to each station where a transfer is available.
I saved this simple Unity test and changed the build target to Android. Unfortunately, so far I haven’t been able to get the app working on an Android phone. I am currently troubleshooting the Android SDK and the Unity settings.
HOMEWORK 4 - 2/25
This week I decided to take a deeper dive into Unity. I started watching Intro to Unity tutorials on Lynda.com and learned how to set up a simple project in Unity 3D with Vuforia. I also found many example videos of the process on YouTube.
For my first AR project using Vuforia and Unity, I will use a 3D model of Tinkerbell. I rigged the model in Mixamo, added a simple animation to it, and exported everything as an .fbx file.
I will place Tinkerbell on an image tracked with the Vuforia plugin for Unity.
Also, I really enjoyed the hologram tutorial that was posted on our class blog so I decided to make it myself at home. It was very simple to make but the result looked great!
HOMEWORK 3 - 2/19
LAYAR 3D MODEL CONVERTER / LAYAR GEO LAYERS / PHP MYADMIN
For this week's assignment using Layar Geo Layers, I wanted to create a jungle in the middle of a city covered in winter snow. I am originally from Poland, and Polish winters can be very severe. I wanted to "plant" a group of palm trees in front of my house in Warsaw – a taste of the tropics for my warmth-deprived family – and ask my brother to take a picture of my work.
I started with preparing my model in Maya. I exported a .obj file of palm trees and an .mtl file. I then brought it to Layar 3D Model Converter to convert the .obj into .l3d, and place it on a map. I decided to first try to place the palms on my own street to test it and then move them to Warsaw.
I finally managed to make the palm trees appear in my room.
SMAAASH EXPERIENCE - BEER PONG VR
HOMEWORK 2 - 2/12
LAYAR APP
This week, using Layar, I decided to create a Valentine's Day ad for Tiffany's. I was inspired by the AR shoe ad we looked at last week. I wanted to create an ad in which AR would be not just an interesting addition but a necessary tool. I wanted to keep the printed ad simple, but with a hint of mystery that would be revealed through the Layar app. I went with Tiffany's because of their iconic color scheme.
The printed Valentine's Day ad for Tiffany's would consist of a closed blue gift box with a white ribbon and the Tiffany's logo. Once a user points the Layar camera at the box, a video pops up in its place showing the box opening and the jewelry inside being revealed.
HOMEWORK 1 - 2/5
UE4 / TRINUS VR / GOOGLE CARDBOARD
For my first virtual reality experience, I wanted to create a space that might be impossible to explore in real life. I saw an episode of Planet Earth about caves the other day and I imagined how great it would be to see one of those amazing spaces in person.
I decided to try to build a simple cave using UE4. I looked for assets online and found some interesting examples of rocks, stalactites, and stalagmites. I also searched Unreal's Marketplace for materials and textures.
I downloaded a trial version of Trinus VR and used it with my Android phone and Google Cardboard to see the cave in VR.
I also tried to create a more abstract environment using video as an object material. I imported a couple of .mp4 videos I had made earlier into Unreal and used them as materials for the walls of my room.

