Hyprsense Debuts Hyprmeet, Immersive Avatars for Video Calls


Users can design a “digital self” to represent themselves online

Hyprsense, a company that develops AI and computer-vision technology for facial motion capture, today announces the launch of Hyprmeet, a product that lets consumers create a fully virtual, immersive avatar of themselves for use in digital environments such as games and video-call applications like Zoom and Discord.

“Although people are spending more and more time meeting online, they don’t always want to show their faces,” said Jihun Yu, founder and CEO of Hyprsense. “Hyprmeet lets them use a virtual version of themselves instead. With our real-time facial capture technology, they can create and present a live virtual version of themselves that looks like them and conveys their natural movement.”

The digital-self world has boomed over the last few years with advances in machine learning and computer graphics. As people have grown more familiar with, comfortable in, and dependent on digital spaces, 3D avatars have come into their own. The ongoing pandemic has made people more accustomed to working from home, taking classes remotely, having dinner with family and friends online, and building their identity in this digital world.

“Millennials and Gen Z have become accustomed to portraying themselves digitally in games such as Fortnite,” added Yu. “With video calls becoming the new socializing platform, for work, school, and hanging out, those consumers want a better digital experience. Hyprmeet is focused on making this digital experience as immersive as possible.”

Using facial-recognition technology, the platform captures facial expressions in real time, with just a webcam, to create a dynamic 3D avatar personalized to each user. From there, users can further personalize the experience by swapping out their background for one of Hyprmeet’s virtual backgrounds.

Hyprmeet uses Hyprsense’s real-time facial-tracking technology, Hyprface, which detects and tracks facial expression parameters as weight values and combines all of the values into a per-frame expression. It also calculates an eye-gaze vector, faithfully animating the 3D character’s eye motion to add a further layer of realism to the facial animation.
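The approach described above, per-frame expression weights driving a blendshape rig plus a gaze vector for the eyes, can be sketched roughly as follows. This is a minimal illustrative sketch, not Hyprface’s actual API: the function names, array shapes, and the linear blendshape model are all assumptions for the sake of the example.

```python
# Hypothetical sketch of per-frame avatar animation from tracked weights.
# All names and shapes here are illustrative assumptions, not Hyprface code.
import numpy as np

def blend_expression(neutral, blendshapes, weights):
    """Combine blendshape deltas into one per-frame expression.

    neutral:     (V, 3) neutral face vertex positions
    blendshapes: (K, V, 3) per-expression vertex offsets from neutral
    weights:     (K,) tracked expression parameters, clipped to [0, 1]
    """
    w = np.clip(np.asarray(weights, float), 0.0, 1.0)
    # Weighted sum of offsets added to the neutral mesh.
    return neutral + np.tensordot(w, blendshapes, axes=1)

def gaze_vector(eye_center, gaze_target):
    """Unit vector from the eye toward a gaze target, which could then
    drive the avatar's eye rotation each frame."""
    v = np.asarray(gaze_target, float) - np.asarray(eye_center, float)
    return v / np.linalg.norm(v)

# Tiny worked example: 2 vertices, 2 expressions.
neutral = np.zeros((2, 3))
blendshapes = np.array([[[1.0, 0.0, 0.0], [0.0, 0.0, 0.0]],   # e.g. "smile"
                        [[0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]])  # e.g. "brow raise"
frame = blend_expression(neutral, blendshapes, [1.0, 0.0])
gaze = gaze_vector([0.0, 0.0, 0.0], [0.0, 0.0, 2.0])
```

With the first weight fully on and the second off, the resulting mesh is the neutral mesh displaced by the first blendshape only, and the gaze vector is normalized to unit length.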

Hyprmeet is available for download here: http://www.hyprmeet.com.


Contact Author

Lisa Langsdorf
GoodEye Public Relations
+1 646-828-7415