This scientific breakthrough, modeled on the human eye, could lead to safer self-driving cars and better smartphone cameras
As you read this, your eyes are scanning from left to right. But even when you’re not reading or tracking a moving object, your eyes are constantly in motion. That motion turns out to be key to the quality of human vision, and to how robots, self-driving cars, and maybe even smartphones could see more clearly.
A team of researchers from the University of Maryland has created a camera that mimics these eye movements. The camera, called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), spins a round wedge prism (a disc of glass with one face cut at a slight angle) in front of an event camera, in this case an Intel RealSense D435, to continuously shift the image.
Although the movements are small, they are meant to mimic the saccades of the human eye. Saccades span several scales of movement: large, rapid jumps; slower corrective movements; and microsaccades, tiny involuntary twitches that occur multiple times per second and are small enough to be imperceptible to us.
It is this last movement that helps us see clearly, especially when objects are moving. Microsaccades continually reposition the image on the sharpest part of the retina, refreshing it so that shape and color don’t blur or fade.
Drawing on how these micro-movements shape human perception, the team equipped its camera with a rotating prism to play the same role.
According to the paper’s abstract: “Inspired by microsaccades, we designed an event-based perception system capable of simultaneously maintaining low reaction time and stable texture. In this design, a rotating wedge prism was mounted in front of the aperture of an event camera to redirect light and trigger events.”
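To make the geometry concrete, here is a minimal Python sketch of how a spinning wedge prism shifts an image. It uses the standard thin-prism approximation, and every numeric value (wedge angle, refractive index, focal length, rotation rate) is an illustrative assumption, not a figure from the paper.

```python
import numpy as np

# Illustrative parameters -- assumptions for this sketch, not from the paper.
N_GLASS = 1.5                   # refractive index of the prism glass
WEDGE_ANGLE = np.deg2rad(2.0)   # wedge angle alpha, in radians
FOCAL_LENGTH_PX = 700.0         # lens focal length expressed in pixels
ROTATION_HZ = 50.0              # prism rotation rate

# Thin-prism approximation: a wedge deviates rays by delta ~ (n - 1) * alpha.
DEVIATION = (N_GLASS - 1.0) * WEDGE_ANGLE

# Radius of the small circle the image traces on the sensor, in pixels.
RADIUS_PX = FOCAL_LENGTH_PX * np.tan(DEVIATION)

def image_offset(t: float) -> tuple[float, float]:
    """Pixel offset (dx, dy) of the whole image at time t (seconds).

    As the wedge spins, every scene point sweeps a small circle on the
    sensor, so brightness keeps changing at every pixel -- which is what
    triggers events even when the scene itself is static.
    """
    theta = 2.0 * np.pi * ROTATION_HZ * t
    return RADIUS_PX * np.cos(theta), RADIUS_PX * np.sin(theta)

if __name__ == "__main__":
    for t in (0.0, 0.005, 0.010):
        dx, dy = image_offset(t)
        print(f"t={t*1e3:4.1f} ms  offset=({dx:+.2f}, {dy:+.2f}) px")
```

With these assumed values the image circles a radius of roughly 12 pixels, fifty times a second: small, fast, and everywhere at once, much like a microsaccade.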
The researchers paired the hardware with software that compensates for the prism’s motion, merging the shifted output into a single stable, sharp image.
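The paper’s actual compensation pipeline is more sophisticated, but the core idea can be sketched simply: because the prism’s rotation phase at every timestamp is known, the shift it introduced can be subtracted back out. The snippet below assumes the simplified circular-shift model from the sketch above; the event format and all values are hypothetical.

```python
import numpy as np

# Same illustrative prism model as the sketch above (assumed values).
RADIUS_PX = 12.2        # image-circle radius in pixels
ROTATION_HZ = 50.0      # prism rotation rate

def stabilize_events(events: np.ndarray) -> np.ndarray:
    """Undo the known prism-induced shift for a batch of events.

    `events` is an (N, 4) array of rows (x, y, t_seconds, polarity).
    Since the prism's rotation phase at each timestamp is known, the
    circular offset it adds can simply be subtracted, leaving events in
    a stabilized frame that can be accumulated into a sharp image.
    """
    x, y, t, p = events.T
    theta = 2.0 * np.pi * ROTATION_HZ * t
    x_stab = x - RADIUS_PX * np.cos(theta)
    y_stab = y - RADIUS_PX * np.sin(theta)
    return np.stack([x_stab, y_stab, t, p], axis=1)

# Example: three events fired at different prism phases map back
# to (nearly) the same scene point once the shift is removed.
events = np.array([
    [112.2, 200.0, 0.000, 1.0],
    [100.0, 212.2, 0.005, 1.0],
    [ 87.8, 200.0, 0.010, 1.0],
])
print(stabilize_events(events))
```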
According to a report in Science Daily, the experiments were so successful that AMI-EV-equipped cameras detected everything from fast-moving objects to the human heartbeat. That’s precise vision.
Making robot eyes see more like human ones opens up the possibility not only of robots that share our visual abilities, but also of self-driving cars that can finally distinguish between people and other objects. There is already evidence that self-driving cars struggle to identify some people. A self-driving Tesla equipped with an AMI-EV camera might be able to tell the difference between a bag blowing past and a child running into the street.
Mixed-reality headsets, which rely on cameras to merge the real and virtual worlds, could use AMI-EV cameras to blend the two more seamlessly and deliver a more convincing experience.
“…it has many applications that a large portion of the general public is already experiencing, such as autonomous driving systems or even smartphone cameras. We believe that our new camera system paves the way for more advanced and capable systems to come,” Yiannis Aloimonos, a professor of computer science at UMD and a co-author of the study, told Science Daily.
It’s still early days, and the current hardware looks more like something you’d build into a computer system than the ultra-small, thin camera module you’d want for the best smartphone.
Still, the realization that something we can’t see is responsible for much of what we do see, and that this small but crucial piece of vision can be replicated in robotic cameras, is an important step toward a future where robots rival human visual perception.