April 9, 2018

From the Human Eye to AI: A Brief History of Vision

Read on as our VP of Product, Rajesh Talpade, takes us on a short trek through time with his piece: From the Human Eye to AI: A Brief History of Vision.


With all its abounding possibilities, Artificial Intelligence might seem like a technological leap for humanity, but in reality it's only our latest step in enhancing the way we see the world. Responsible for one of the five critical senses, the eye is among the most complex and vital organs of the human body, and our survival through the ages often depended on its ability to detect danger or discern food sources. Still, while the human eye has evolved to be quite capable of color and emotion detection, unassisted human vision remains limited compared to that of other animals like eagles, cats or hammerhead sharks, all of which have better telescopic, peripheral or low-light vision. As such, throughout our history we have had to augment our sight. And while A.I. is decidedly our most complex augmentation to date, even the simplest of advancements, like fire on a stick, were huge for our survival as a species.


Until relatively recently, humans had to be satisfied with unassisted vision, with the need for adequate natural light, line of sight, the size of and distance from objects, and visual acuity all acting as constraints. Below, we identify some key innovations that helped overcome these challenges and highlight the significant impact each has had on increasing the scientific knowledge-base.


As Asimov and others have ably illustrated, there is a strong correlation between an increase in the scientific knowledge-base and an improvement in the human quality of life. Humans overcame the challenge of limited vision in poor natural-light conditions when we learned to harness light sources. Beyond its importance for providing warmth and cooking food, mastering the creation and control of on-demand fire reduced the duration of the dark cycle and gave us many more active waking hours to acquire and transmit knowledge. The consequently shorter sleep cycle also meant humans learned to sleep deeply, which may have contributed to our cognitive superiority over our closest relatives. Through bonfires, torches, oil lamps, candles and, most recently, the incandescent light bulb, fire allowed human activities to expand beyond the limitation of sunlight, but it did little to address refractive errors and visual impairment.


These limitations were instead overcome by advances in optical technology over the centuries, culminating in the invention of eyeglasses during the early 1300s, which addressed poor vision due to the defects that cause nearsightedness, farsightedness and astigmatism. Eyeglasses further enabled humans to participate in and contribute their accumulated experiences to the collective knowledge-base even in the latter part of their lives, when age-induced vision loss occurs. The lens was further utilized in the subsequent inventions of the microscope and the telescope in the early 1600s, enabling humans to see minute and distant objects with clarity far beyond that possible with the naked eye. These inventions helped spur numerous biological and geographical discoveries, both on Earth and beyond, further increasing the scientific knowledge-base. Interestingly, the periscope was invented earlier, in the 1430s, to overcome the line-of-sight challenge, with the specific use-case of allowing pilgrims to see over the crowds in front of them!


All of these advancements culminated in humanity harnessing technology to amplify our perception and see well beyond the physically obvious. The discovery of frequencies outside the visible spectrum during the late 1800s, for instance, led to a wave of radiological inventions, such as ultrasound and X-rays, that leveraged virtually the entire electromagnetic spectrum to see through animate and inanimate objects. This previously unimaginable capability increased our understanding of the internal structures of objects, contributing to our scientific knowledge-base and spurring a variety of medical and industrial applications. With the invention of the computer in the 1930s and further innovations in computing hardware and artificial intelligence algorithms, humans can now gain insights that are impossible to acquire with the eye alone, even when the images and videos in question lie within the visible spectrum or come from any of our previous inventions. A.I., and computer vision in particular, enables humans to discover hidden patterns in the large volumes of still images and videos that make up the deluge of data we've accumulated over the last few years. If advances in other forms of A.I. continue at the current pace, it is not inconceivable to imagine a near future in which the visually impaired also benefit immensely from the likes of wearable computing and brain-machine interfaces.


We are still in the early stages of Artificial Intelligence and are only just starting to understand its full range of benefits and challenges. That said, it is but another step forward for humanity, and it is not unreasonable to expect its impact on our quality of life to be as significant as, if not exceed, that of all our previous vision-related technological advancements.


Thanks for sharing, Rajesh!


Want to connect with Rajesh? Get in touch!
LinkedIn: rtalpade
Twitter: RTalpade