Robotic vision is ready for a makeover, and it is likely to take its inspiration from cats. Feline eyes have optical qualities robotics experts are drawing on as they develop advanced cameras and visual capabilities. A recent research study suggests this link could shape the future of robotics. What traits will robots adopt to become more catlike?
Experts recently released a report detailing how robotic vision systems could use biomimicry, the practice of letting biological systems inform human-made innovations, with cat eyes as the model. The study's equipment already incorporated an elliptical slit aperture, like a cat's pupil, to help distinguish objects from their backgrounds. It was time to see what more felines could offer.
The goal is to make a monocular system that uses only one camera to detect and identify objects in a space. Researchers needed a hardware solution to improve sight because relying on software alone was too energy-intensive. The design included a custom lens and a silicon photodiode array with reflectors to help manage light. Refining object recognition with this setup would be a meaningful advance for robotics.
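To make the software side of a monocular pipeline concrete, here is a minimal sketch in Python using OpenCV. It is not the study's hardware design and does not model the custom lens or photodiode array; it simply shows single-camera object detection, separating candidate objects from the background. The threshold method and minimum-size cutoff are illustrative assumptions.

```python
import cv2

# Minimal single-camera (monocular) object detection sketch.
# Assumes a standard webcam at index 0; the study's hardware
# (custom lens, photodiode array, reflectors) is not modeled here.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Could not read a frame from the camera")

# Separate objects from the background with Otsu's threshold,
# a crude software stand-in for the aperture's contrast-boosting role.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Each sufficiently large contour becomes a candidate object.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) > 500:          # illustrative minimum size
        x, y, w, h = cv2.boundingRect(c)
        print(f"candidate object at x={x}, y={y}, size={w}x{h}")
```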
Robotics engineers replicate these feline qualities in hardware by identifying which optical features create the animals' evolutionary advantages. For example, cats have a tapetum lucidum, a reflective layer behind the retina that humans lack. It lets them see in low light, making them master hunters at any time of day. They also have vertically elongated pupils that enhance depth perception and protect the retina.
The hope is that these advancements will let robots adapt to conditions automatically. Most cameras require manual intervention to adjust settings, such as letting in a certain amount of light or changing the field of view. Feline eyes are dynamic, and eventually robots' cameras will be too. A camera's role will go beyond capturing images for later analysis; it will assess objects in real time while storing those determinations for future vision training.
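As a rough illustration of that kind of automatic adaptation, the sketch below implements a simple software auto-exposure loop in Python with OpenCV: it measures each frame's mean brightness and nudges a digital gain toward a target level without manual intervention. The target brightness and step size are assumptions made for the example, not parameters from the study.

```python
import cv2
import numpy as np

TARGET_BRIGHTNESS = 120   # assumed mid-gray target on a 0-255 scale
GAIN_STEP = 0.05          # assumed adjustment rate per frame
gain = 1.0

cap = cv2.VideoCapture(0)
for _ in range(100):                      # short adaptation loop
    ok, frame = cap.read()
    if not ok:
        break
    # Apply the current digital gain, then measure how bright the scene looks.
    adjusted = cv2.convertScaleAbs(frame, alpha=gain, beta=0)
    brightness = np.mean(cv2.cvtColor(adjusted, cv2.COLOR_BGR2GRAY))
    # Nudge the gain up in dim scenes and down in bright ones.
    if brightness < TARGET_BRIGHTNESS:
        gain += GAIN_STEP
    elif brightness > TARGET_BRIGHTNESS:
        gain = max(0.1, gain - GAIN_STEP)
cap.release()
print(f"settled digital gain: {gain:.2f}")
```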
Building these characteristics into a robot has numerous benefits. Honing photosensitivity was a primary objective: the robots needed to see as well as a cat does in low-light conditions, where a cat can stay focused on prey regardless of how scattered or dim the light is. Translating this to a robot required metal reflectors, which helped the cameras see in both bright and dim settings.
The researchers also theorized further applications of feline vision in their equipment, taking the light manipulation a step further. What if a robot could see camouflaged objects, regardless of visibility? Incorporating feline traits helped the vision system overcome its tendency to conflate pixels in overly dark or bright conditions. With better target differentiation, fewer images came out as indistinct blobs of color.
The study succeeded in showing that cat eyes are valuable models for robots. The design was 52% better at absorbing light and adjusted its features based on the environment. The vertical-pupil design also identified objects more accurately than conventional models, scoring 94.44% versus 88.8%.
Do these changes have any disadvantages? Although software-based vision saps a robot's energy, it may still be cheaper than installing complex cameras, so price could be a barrier to widespread adoption of feline-inspired robotic vision. However, the mechanical stress and energy consumption of earlier designs could cost more over a robot's working life than the upfront installation of advanced vision hardware.
Improving robots' eyesight is an obvious plus for field research, but it will only become a design and engineering staple if experts find practical ways to apply it. Will companies or households be incentivized to purchase and use a robot with catlike eyes?
Monocular camera systems can go into nearly every type of robot, including autonomous vehicles (AVs) and drones. The range of machinery they fit into hints at how vast their personal and commercial applications could be. Additionally, artificial intelligence could refine the robots' abilities further, with deep learning training them on more nuanced images.
Everyone from animal conservationists to archaeologists could take advantage of catlike cameras in robots. A robot might identify an artifact in the dark from a distance, helping preserve relics at a dig site. Advanced underwater drones could survey remote habitats, deepening understanding of regions where knowledge gaps are prevalent. So long as the robot stays powered, scientists can see more in these unique environments than they ever could before.
The technology could also advance AVs. Improved cameras, alongside the other tracking and detection features in these cars, could make them safer to be around. Novel cameras could make them better navigators, recognizing obstacles and making precise turns promptly. A catlike camera could determine how to prioritize focus instead of giving the entire image equal clarity.
For example, if a pedestrian were jaywalking in the distance, this model could flag them and act accordingly instead of automatically giving near-field objects higher precedence. It must know when to blur certain features and sharpen others, as in the sketch below.
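As a rough sketch of what that prioritization might look like in software, the Python snippet below ranks detections by a hypothetical attention score that combines object class with time to conflict. The classes, weights and numbers are illustrative assumptions, not the study's model.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    distance_m: float          # estimated distance from the vehicle
    closing_speed_mps: float   # how fast the object approaches the car's path

# Hypothetical weighting: vulnerable road users outrank static objects,
# and less time-to-conflict means more urgency, even at long range.
CLASS_WEIGHT = {"pedestrian": 3.0, "cyclist": 2.5, "vehicle": 1.5, "debris": 1.0}

def attention_score(d: Detection) -> float:
    time_to_conflict = d.distance_m / max(d.closing_speed_mps, 0.1)
    return CLASS_WEIGHT.get(d.label, 1.0) / time_to_conflict

detections = [
    Detection("vehicle", distance_m=8.0, closing_speed_mps=0.5),      # parked car nearby
    Detection("pedestrian", distance_m=40.0, closing_speed_mps=1.4),  # distant jaywalker
]

# The distant pedestrian can still rank first, so the system would sharpen
# that region of the image instead of treating every nearby object as priority.
for d in sorted(detections, key=attention_score, reverse=True):
    print(f"{d.label}: score {attention_score(d):.2f}")
```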
Another primary use would be surveillance at all hours of the day. These cameras could monitor agricultural properties for signs of pests and predators. They are also deployable for defense, overseeing areas to confirm they are secure. This reduces the time and labor people need to invest in business-critical operations that improve safety.
Other studies have examined how other animals' eyes, including those of fish, crabs and birds, could help robots. Clearly, experts are not finished exploring wildlife's eyes and other senses for ideas that could shape the robots of the future. This study opens the door to what else nature could offer technology.
Industry experts could soon use these insights to make robots more visually adept. The work could also inspire subsequent research that embraces other biological phenomena to enhance a robot's potential. Robotics designers will soon be asking what birds, fish, reptiles and every creature in between have to teach humanity about robotic modeling and engineering.