"Vergence Labs' latest prototype was able to recognize my face with high accuracy out of a large sample space. Very impressive." ~Samuel Garrett [a Stanford Engineer]
Stanford, California (PRWEB) February 10, 2012
Vergence Labs, founded by Jon Rodriguez (an engineer from Stanford University) and Erick Miller (MBA from UCLA and NUS), has created prototype augmented reality glasses that overlay data from the cloud, including a Facebook UI, onto the wearer's real-life field of view. The glasses carry cameras on the outside and an immersive computer on the inside that can access and display data from web applications. The camera feed enables facial recognition of the person being interacted with, as well as mixed-reality effects that manipulate the scene with computer graphics. In the prototype, if a recognized face belongs to someone on the wearer's social friend list, a UI appears displaying that person's name and latest status update; Google can then be queried for a broader web search about that person.
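A minimal sketch of the kind of friend-list lookup the facial-recognition step described above might perform: compare an embedding of the detected face against precomputed embeddings of the wearer's friends and, on a sufficiently close match, return the friend's name and latest status. All names, embeddings, and the threshold here are illustrative assumptions, not details of Vergence Labs' actual system.

```python
import math

# Hypothetical friend list: each entry pairs a precomputed face embedding
# with the friend's latest status update. Values are made up for illustration.
FRIENDS = {
    "Alice Example": {"embedding": [0.9, 0.1, 0.3], "status": "Trying AR glasses today!"},
    "Bob Example": {"embedding": [0.2, 0.8, 0.5], "status": "At the VLab event."},
}

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recognize(face_embedding, threshold=0.95):
    """Return (name, status) for the best friend-list match above threshold, else None."""
    best_name, best_score = None, threshold
    for name, info in FRIENDS.items():
        score = cosine_similarity(face_embedding, info["embedding"])
        if score > best_score:
            best_name, best_score = name, score
    if best_name is None:
        return None  # face not on the wearer's friend list; no UI overlay shown
    return best_name, FRIENDS[best_name]["status"]
```

In a real system the embeddings would come from a face-recognition model and the statuses from a social API; this sketch only shows the nearest-neighbor matching step.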
Commercialization of immersive wearable technology is at an early but exciting stage, with huge potential to positively redefine person-to-person social interaction using metrics and data "you generate," which can be shared on the website YouGen.Tv; many have said this technology has “world changing” potential. Immediate, low-hanging uses include never forgetting the names of acquaintances in one's social network, seeing others' relationship status while mingling, and playing online social “reality” games in an immersive 3D context. The prototype design also specifies biometric interfaces, such as a brain-computer interface, to control the computer with the mind and to let users learn about and enhance their own behaviors based on feedback from bio-signals such as thought signals, blood pressure, and heart rate, as well as other easily computer-readable vital signs.
Though it might sound space-age, “augmented reality” and “wearable” technology are uniting and starting to hit the market today. Moore’s Law has already miniaturized computers to an easily wearable size and weight, while new form factors and clever uses of machine vision can make learning and face-to-face interaction between people more personal, expressive, and simply more humane. People can use biometric signals and gestures to control wearable computers, and the computers will interpret the many dimensions of data, then draw it directly into users' eyes in a form that is easy to understand and consume. Widespread communications, empowered by high-bandwidth wireless networks, immersive transparent displays, graphics acceleration, and lightweight wearable sensors, will evolve to bring people closer than ever to the utopian ideal of directly exchanging thoughts, even vivid visual ones, with those they love and care about.
About Vergence Labs, Inc.
Vergence Labs is a company founded by Erick Miller and Jon Rodriguez with the audacious intent to reinvent the future of the human-computer paradigm, on a mission to “Enhance Humanity. Redefine Reality.” The startup team has recently garnered much interest from Silicon Valley and Los Angeles media, technology, and investment firms, though it has not officially entered a fundraising stage; the CEO states that fundraising will begin in March. Vergence Labs recently presented at the esteemed Stanford/MIT VLab event on wearable computers, and this week has the honor of giving multiple demos to the luminaries attending the Singularity University event on the future of med-tech at NASA Ames. Find out more about Vergence Labs at http://www.vergencelabs.com/