The white-gold or blue-black dress. Not since Kate Middleton strode down the aisle clad in Alexander McQueen has a single dress taken over the Internet with such heated debate. But beyond giving rise to a viral sensation, what Roman Originals' dress really shed light on was the differences in how people everywhere perceive colors.
After all, color is all a matter of perception.
This is nothing new – scientists have known for ages that some people see colors differently. In fact, we may have only just started seeing certain colors – such as blue. Even though the knowledge may be old news, it often takes something as seemingly insignificant as a dress to make those facts real for many people.
From daily health tracking to helping airline pilots plot routes, wireless technology plays an ever-increasing role in our lives. Enhancing our visual experience, and how we see the world around us, is no exception.
When it comes to assisting with common visual challenges, white balance often makes the biggest difference. Take, for instance, the dress. Both eyes and cameras struggle with white balance. Our natural eyesight can normally tell the difference between a blue-black dress and a white-gold dress because some clue in the environment reveals the color of the light illuminating the object.
Unfortunately, the photo of the dress offered no information about the source of the illumination. Without that information, our eyes have difficulty determining color. Though never foolproof, smartphone cameras can remove some of this challenge through their manual white balance settings. By adjusting these settings, the camera may be able to compensate for the light so the blue dress is rendered as blue – mimicking the natural function of the human eye.
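To see how a camera can compensate for an unknown light source, here is a minimal sketch of the "gray-world" assumption that automatic white balance algorithms commonly use: assume the scene averages out to neutral gray, and scale each color channel until it does. This is an illustrative simplification, not the algorithm any particular phone ships.

```python
import numpy as np

def gray_world_white_balance(image):
    """Scale each channel so the average color is neutral gray (gray-world assumption)."""
    image = image.astype(np.float64)
    channel_means = image.reshape(-1, 3).mean(axis=0)  # average R, G, B
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel correction
    balanced = image * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)

# A "white" pixel with a bluish cast is pulled back toward neutral gray:
tinted = np.full((1, 1, 3), (180, 190, 230), dtype=np.uint8)
print(gray_world_white_balance(tinted))  # → [[[200 200 200]]]
```

A pixel that looked blue under the tinted light comes out neutral once the per-channel gains are applied – roughly what happens when you nudge a phone camera's manual white balance slider.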
But this isn’t all today’s smartphones can offer in the realm of visual assistance…
Just last week Verizon announced that a revolutionary new technology called VelaSense by Visus would be available exclusively to Verizon Wireless customers with Android smartphones. This first-of-its-kind technology is designed for customers with any degree of visual impairment, from low vision to complete blindness. The technology allows visually impaired people to better identify colors to pick out clothes, read labels or menus, or identify the bills in their hand. It will also make it easier for users to navigate their mobile devices.
Through its new Android 5.0 Lollipop OS, Google is also working to provide visually impaired smartphone users with a better experience. High Contrast Text will highlight items that are normally difficult to read with outlines that are easier to see. Color Inversion will flip the device's entire color scheme to its inverse, and Color Correction will let users adjust the display to compensate for different types of color blindness, such as deuteranomaly (red-green), protanomaly (red-green) and tritanomaly (blue-yellow).
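Of these features, Color Inversion is the simplest to picture: every pixel value is flipped to its complement. A two-line sketch shows the idea (this is a conceptual illustration, not Android's actual rendering code):

```python
import numpy as np

def invert_colors(image):
    """Flip every 8-bit pixel to its complement, as a display-wide color inversion does."""
    return 255 - image

# Dark text on a bright background becomes light text on a dark background:
pixels = np.array([[[10, 10, 10], [245, 245, 245]]], dtype=np.uint8)
print(invert_colors(pixels))  # → [[[245 245 245] [ 10  10  10]]]
```

For many low-vision users, that swap – light content on a dark field – reduces glare and makes text easier to pick out.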
Soon, there may also be smartphone apps – using a technology called Aerial Object Detection – that help visually impaired people avoid obstacles in their paths.
Good Old-Fashioned Aperture
In the meantime, while these new technologies roll out, a smartphone's camera can sometimes be just as helpful. By looking through the camera and using the zoom function, nearsighted users can see distant objects more clearly. Most Android and iOS cameras also let users manually focus on different areas in view by tapping the smartphone's display. This reorients the focus so users with impaired eyesight can pick out detail more easily. Tapping the screen can dramatically change the outcome of a photo by shifting both the focus point and the exposure. Try it yourself!