
How AI would solve the metaverse’s language problem

One of the most fundamental aspects of the metaverse is its inherently democratic nature. Everyone on the planet, in theory, has access to the same virtual world(s), and geography does not prevent them from participating.

With one exception: language. People who meet in the metaverse may struggle to share an experience if they do not speak the same language. Good translation services exist, such as Google Translate and Skype Translator, but the issue is scale: these services are typically designed for one-on-one conversations, whereas a metaverse encounter often involves dozens or even hundreds of people, each speaking their own language.

Enter Onemeta AI. Its Verbum service, which will be demonstrated for the first time at CES 2023, can translate in real time among up to 50 people speaking different languages (it supports 82 languages and 40 dialects, the company says). It provides not only real-time transcripts but also translated voice.

“You could have 50 people on a Zoom call, and they could each have their own native tongue,” says David Politis, a representative for Onemeta. “They would hear someone speak in Japanese, but they would then hear it in English or in Italian or in Russian and onscreen they would see it in their language as well.”
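The many-to-many scenario Politis describes has a useful property: a room with 50 listeners does not need 50 translations per utterance, only one per distinct target language. The sketch below illustrates that fan-out pattern; it is a hypothetical illustration, not Verbum's actual API, and `translate` is a placeholder for a real machine-translation call.

```python
# Hypothetical sketch of many-to-many translation fan-out (not Verbum's API).
# Each participant declares a preferred language; every utterance is translated
# once per distinct target language, then delivered to each listener.

from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    language: str  # e.g. "en", "ja", "es"

def translate(text: str, source: str, target: str) -> str:
    """Placeholder for a real machine-translation service call."""
    if source == target:
        return text
    return f"[{source}->{target}] {text}"

def broadcast(speaker: Participant, text: str, room: list[Participant]) -> dict[str, str]:
    """Translate one utterance once per distinct target language, then fan out."""
    targets = {p.language for p in room if p is not speaker}
    cache = {lang: translate(text, speaker.language, lang) for lang in targets}
    return {p.name: cache[p.language] for p in room if p is not speaker}

room = [Participant("Aiko", "ja"), Participant("Maria", "es"), Participant("Sam", "en")]
print(broadcast(room[0], "konnichiwa", room))
```

With many listeners sharing a language, the per-language cache keeps translation cost proportional to the number of languages in the room rather than the number of participants.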

We were able to try Verbum on Thursday night at CES. We used a headset to speak with a woman in Central America, and the technology translated what we said into Spanish and what she said into English. Despite the occasional minor hiccup, the conversation moved smoothly and felt natural. The AI voice, which sounded about as good as TikTok's familiar text-to-speech voice, arrived around a second after each sentence was spoken and transcribed.

Onemeta is focusing Verbum first on international team meetings, but the service could also be used for metaverse experiences: consider an online multiplayer game like Call of Duty, where players from all over the world want to communicate in real time, or an esports event where people want to watch the action while talking to one another.

“The most commonly spoken language is English,” Politis says. “But if your native language is Portuguese or Russian, your English is rarely going to be the same as your native language. And so there is going to be miscommunication — it’s just going to happen. We can eliminate almost all of that.”

There is clear demand for what Onemeta is offering with Verbum, but its success will be determined by whether others, particularly Microsoft and Google, which have resources that Onemeta does not, rise to the same challenge.



About MahKa

MahKa loves exploring the decentralized world. She writes about NFTs, the metaverse, Web3 and similar topics.
