Animals and Birds Use Retinal Jitter Strategies To Focus

Have you ever wondered how you can walk or jog, with your head bouncing up and down, while still focusing on an object either nearby or far away? Have you noticed how, while doing so, you can judge an object’s distance, speed, and minute details quickly and accurately? The reason you can do this so well is that your mind uses frame bursting of images from memory, along with retinal jitter, to fill in the details quickly while your visual cortex fills in the blanks – all of it happening in milliseconds, on a brain drawing barely 20 watts of power. Wow, talk about state-of-the-art organic design and technology – impressive, my fellow human.

Of course, some animals and birds do this even better than we do, with much smaller brains. Consider, if you will, an owl, a hawk, or a bald eagle. The phrase “Eagle Eyes” is apropos here – think about it. Using biomimicry strategies, perhaps we can make our UAV (unmanned aerial vehicle) or drone video imaging more powerful and acute – and, in doing so, consider for a moment how many applications that would affect. How are we doing so far with these concepts? Well, 3-axis gimbals are the most sought-after by small-drone owners, but why settle for 3 axes when a 4-, 5-, or 6-axis gyro-stabilized gimbal could deliver better video resolution and accuracy? That would certainly help stabilize the video camera, as do quadcopter designs, which are quite stable even in moderate turbulence.
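At the heart of any gyro-stabilized gimbal axis is a feedback loop that measures the camera’s tilt and drives a motor to cancel it. As a minimal sketch – with illustrative gains, a one-line stand-in for the motor and airframe physics, and nothing taken from a real flight controller – a single axis might be held level with a PID loop like this:

```python
# Minimal single-axis gimbal stabilizer sketch using a PID loop.
# Gains, timestep, and the simplistic "plant" model are all invented
# for illustration; a real gimbal controller is far more involved.

class AxisPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Simulate holding the camera level (setpoint 0 degrees) after a 10-degree jolt.
pid = AxisPID(kp=4.0, ki=0.5, kd=0.8, dt=0.01)
angle = 10.0  # initial tilt in degrees
for _ in range(5000):
    correction = pid.update(0.0, angle)
    angle += correction * 0.01  # toy plant: correction directly nudges the angle

print(f"residual tilt: {angle:.4f} deg")
```

A multi-axis gimbal is, to a first approximation, one such loop per axis; the extra axes in a 4-, 5-, or 6-axis design give the controller more degrees of freedom to absorb the same disturbance.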

Let’s talk about strategies for a moment – to get to that eagle eye ability we see in nature. One patent, “Apparatus and methods for stabilization and vibration reduction,” US 9277130 B2, duly states: “Currently, there exists primarily four methods of vibration dampening commonly employed in photography and videography to reduce the effects of vibration on the picture: software stabilization, lens stabilization, sensor stabilization, and overall shooting equipment stabilization.”
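The first method on that list, software stabilization, is worth a closer look because it needs no extra hardware. A common approach is to estimate the camera’s motion path across frames, smooth that path, and shift each frame by the difference. Here is a hedged sketch of the smoothing step – the per-frame displacements are made-up numbers standing in for what a motion estimator would produce:

```python
# Sketch of trajectory smoothing for software video stabilization.
# The "raw_dy" values simulate a cumulative vertical camera path
# (steady drift plus walking bounce); real systems estimate these
# from frame-to-frame feature matching.
import math

def moving_average(path, radius):
    """Smooth a 1-D trajectory with a simple windowed mean."""
    smoothed = []
    for i in range(len(path)):
        lo, hi = max(0, i - radius), min(len(path), i + radius + 1)
        window = path[lo:hi]
        smoothed.append(sum(window) / len(window))
    return smoothed

# Simulated cumulative vertical displacement per frame, in pixels.
raw_dy = [0.5 * i + 8 * math.sin(i * 1.3) for i in range(30)]
smooth_dy = moving_average(raw_dy, radius=4)

# Corrective shift for each frame: warp the image by (smooth - raw),
# so the displayed path follows the smoothed trajectory.
corrections = [s - r for s, r in zip(smooth_dy, raw_dy)]
```

The trade-off is that shifting frames crops their edges, which is why software stabilization is usually combined with the lens, sensor, or whole-rig methods the patent lists rather than used alone.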

What if we also work with visual recognition systems for frame bursting, focusing only on things that meet our mission criteria “OR” are complete anomalies (out of place)? In the human mind, things out of place often trigger the N400 brain wave, evoking curiosity, nuance, or interest. We can program the same behavior using algorithms that require the video camera to investigate, identify, and act. Or, as Colonel Boyd’s “OODA Loop Strategy” suggests: Observe, Orient, Decide, and Act. The fighter pilot who can cycle through that loop quickest should win the aerial dogfight, provided they make good use of their energy and airspeed. Good advice, even if we borrow it to discuss how best to program a UAS (unmanned aerial system) to complete a task or mission.
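To make the borrowed OODA idea concrete, here is a toy sketch of the loop as a drone-camera task pipeline. Everything in it is a placeholder assumption – the detection format, the labels, the scoring – not a real perception API; the point is just the Observe → Orient → Decide → Act structure, with mission matches and anomalies both allowed to capture attention:

```python
# Toy OODA loop for a camera-equipped drone. All detections, labels,
# and scores below are invented placeholders, not output of a real detector.

def observe(frame):
    """Observe: pull candidate detections out of the current frame."""
    return frame["detections"]

def orient(detections, mission_keywords):
    """Orient: keep mission-relevant items OR anomalies, highest score first."""
    relevant = [d for d in detections
                if d["label"] in mission_keywords or d.get("anomaly")]
    return sorted(relevant, key=lambda d: d.get("score", 0), reverse=True)

def decide(ranked):
    """Decide: investigate the top candidate, or keep patrolling."""
    if ranked:
        return {"action": "investigate", "target": ranked[0]["label"]}
    return {"action": "continue_patrol", "target": None}

def act(decision):
    """Act: hand the chosen action to the flight/camera controller."""
    return decision["action"]

frame = {"detections": [
    {"label": "vehicle", "score": 0.9},
    {"label": "umbrella", "score": 0.4, "anomaly": True},  # out of place in a field
]}
ranked = orient(observe(frame), mission_keywords={"vehicle"})
print(act(decide(ranked)))  # → investigate
```

The anomaly flag plays the role of the N400-style “that doesn’t belong here” trigger: an object can earn attention either by matching the mission or simply by being out of place.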

In one paper, “Model-based video stabilization for micro aerial vehicles in real-time,” the abstract states: “The emerging branch of Micro aerial vehicles (MAVs) has attracted a great interest for their indoor navigation capabilities, but they require a high quality video for tele-operated or autonomous tasks. A common problem of on-board video quality is the effect of undesired movement, and there are different approaches for solving it with mechanical stabilizers or video stabilizer software. Very few video stabilizer software can be applied in real-time and their algorithms do not consider intentional movements of the tele-operator.”
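That last point – not confusing the tele-operator’s deliberate pans with jitter – is the hard part. One common heuristic (a sketch of the general idea, not the algorithm from that paper) is to treat low-frequency motion as intentional and cancel only the high-frequency residual, for example with an exponential moving average as the low-pass filter:

```python
# Sketch: separate intentional camera motion from jitter with an
# exponential moving average (EMA). The filter constant and the
# simulated motion signal are illustrative assumptions only.
import math

def split_motion(samples, alpha=0.15):
    """Return (intentional, jitter) components of a 1-D motion signal."""
    intentional, jitter = [], []
    ema = samples[0]
    for x in samples:
        ema = alpha * x + (1 - alpha) * ema  # low-pass: slow, deliberate pans
        intentional.append(ema)
        jitter.append(x - ema)               # high-pass residual to cancel
    return intentional, jitter

# Simulated motion: a slow deliberate pan plus high-frequency vibration.
signal = [0.2 * i + 0.8 * math.sin(i * 2.9) for i in range(100)]
intentional, jitter = split_motion(signal)
```

The stabilizer then corrects only the jitter component, so the operator’s pan survives in the output video. Because an EMA needs just the previous value, this kind of split is cheap enough to run in real time on a small on-board computer.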
