But Envision’s real passion lies in its smart glasses. The company built the Envision Glasses on Google Glass Enterprise Edition. The glasses feature an 8-MP camera that can accurately describe a user’s surroundings or scan and interpret entire documents.
Envision brought its glasses to the CSUN Conference 2022, the same event where they made their debut two years ago. This year, the company is introducing several new and improved AI features.
Envision’s smart glasses get some significant improvements
Karthik Kannan, co-founder of Envision, says: “Our mission is to improve the lives of the world’s two billion people who are blind or visually impaired by providing them with life-changing assistive technologies, products, and services.”
As part of that continuing mission, the company has introduced several improvements to its product at CSUN 2022.
First, Envision has updated its smart glasses with document guidance for more accurate capture. The update provides verbal cues for positioning a document so the glasses can capture it in its entirety in one shot.
Additionally, new layout detection provides detailed guidance on the structure of the text being read, recognizing elements like headers and photo captions for a more complete reading experience.
Envision has also added four new languages to its offline language library and improved its image capture and interpretation accuracy using tens of millions of data points.
Finally, the company has improved its innovative Ally feature, which lets users make private video calls to trusted contacts who can help them visualize their surroundings. The feature has been optimized to deliver better video quality over both Wi-Fi and mobile network connections.
This app has the potential to change lives
Envision’s app and smart glasses are innovative solutions with the potential to change the lives of visually impaired individuals. And with the company continually refining its AI, Envision should only get better over time.