Communication inside the metaverse is immensely challenging. Getting the conversation right, detecting the right emotions, and building empathy – how can a startup break the technology down into components and still build an end-to-end experience? This week in #LaunchWithAI, we’re talking to Harold Dumur, founder and CEO of OVA, a member of Microsoft for Startups Founders Hub, about StellarX, a metaverse builder.
What is StellarX?
“Welcome. I’m Biz, your virtual (non-human) learning coach.”
This introduction comes from an immersive environment created with StellarX, OVA’s intuitive metaverse builder, in collaboration with Desjardins Lab.
The project merges AI with spatial computing, machine learning, and user interface technologies. We created an intuitive, immersive experience to teach and test critical communication skills in a finance and insurance customer service context.
The call agents interact with a smart virtual assistant through scripted conversations based on real-world situations. The virtual trainer evolves contextually alongside the agents’ professional development.
How does contextual evolution work?
We dove into the world of Natural Language Processing (NLP) by leveraging Azure Cognitive Services and its language-understanding framework, LUIS (Language Understanding). With Microsoft’s tools and the Bot Framework, OVA was able to design a conversational flow that was later integrated into its metaverse-building platform.
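The exact dialog OVA built isn’t public, but the shape of such a scripted flow can be sketched as a small intent-to-response mapping. In a real deployment the intent would come from a LUIS prediction; the keyword matcher, intent names, and script lines below are purely illustrative assumptions:

```python
# Illustrative sketch of a scripted conversational flow. In production the
# intent would come from a LUIS (Language Understanding) prediction rather
# than this toy keyword matcher. All names here are hypothetical.

SCRIPT = {
    "lost_card": "I'm sorry to hear that. Let's block the card right away.",
    "check_balance": "Sure, I can help you check your balance.",
    "fallback": "Could you rephrase that for me?",
}

INTENT_KEYWORDS = {
    "lost_card": ["lost", "stolen", "card"],
    "check_balance": ["balance", "account"],
}

def detect_intent(utterance: str) -> str:
    """Stand-in for an intent prediction: pick the intent whose
    keywords overlap the utterance the most."""
    words = set(utterance.lower().split())
    best, best_hits = "fallback", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        hits = sum(1 for k in keywords if k in words)
        if hits > best_hits:
            best, best_hits = intent, hits
    return best

def respond(utterance: str) -> str:
    """Return the scripted response for the detected intent."""
    return SCRIPT[detect_intent(utterance)]
```

A Bot Framework dialog layers turn management, conversation state, and fallback prompts on top of this kind of basic intent-to-response mapping.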
“The purpose of the virtual assistant is to follow the call agent into real-world situations and provide real-time aid. For instance, the virtual assistant can monitor and analyze performance to give valuable feedback,” said Pierre-Luc Lapointe, R&D Director at OVA.
AI-driven OVA has spent the last couple of years building a patent-pending visual scripting tool that is now integrated into StellarX. The goal is to let users co-create interactions with an AI. By combining this project’s discoveries, Azure Cognitive Services, and our visual scripting tools, the team is aiming toward a system that we call “intent-to-code.” This system, once deployed, will let users create interactions in the metaverse with voice commands.
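OVA’s visual scripting tool is patent-pending and not public, so the sketch below only illustrates the idea behind “intent-to-code”: a recognized voice-command intent, plus its entities, is translated into a scene-editing action. The intent schema and action names are hypothetical, not OVA’s actual API:

```python
# Toy illustration of the "intent-to-code" idea: a recognized voice-command
# intent plus its entities is translated into a scene-editing action.
# The intent names and action schema are hypothetical.

def intent_to_action(intent: str, entities: dict) -> dict:
    """Map a (hypothetical) recognized intent to a scene-editing action."""
    if intent == "create_object":
        return {"action": "spawn",
                "object": entities.get("object", "cube"),
                "color": entities.get("color", "white")}
    if intent == "delete_object":
        return {"action": "remove", "object": entities["object"]}
    return {"action": "noop"}

# e.g. the spoken command "add a red chair" might be recognized as:
action = intent_to_action("create_object", {"object": "chair", "color": "red"})
```

The real system would sit behind speech recognition and intent classification; this mapping step is simply where a user’s spoken intent becomes something the metaverse scene can execute.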
This coaching evolution is thanks to the machine learning algorithms that feed the virtual assistant, enabling experiential personalization, predictive analytics, and product or service recommendations.
How crucial is the AI emotional connection?
“I lost my credit card” sounds quite different from “I lost my credit card @#$%&!” For a human, it is quite easy to identify the emotional intent behind a sentence. Here, the employee embodies an avatar in a virtual space and interacts with a second avatar, the customer’s. A third avatar, the AI coach, witnesses the session, gathering data for future feedback.
In this context, a significant challenge was approximating real-life conversations by identifying emotions. So, how precise can you get when categorizing feelings?
We needed to build an effective model to automatically label positive, neutral, and negative emotions in a customer-agent relationship context. We chose Azure Cognitive Services for its simplicity of integration, its reliability, and the time it saved us. We also used its natural language understanding tools to research and develop both the virtual assistant and the intent-to-code features.
The precision of the automatic labeling of emotions was based on two distinct contexts:
- Text analysis: the verbal expression of spoken words was analyzed in its textual form using NLU and NLP capabilities.
- Speech emotion recognition: the same words were analyzed again, this time with their vocal emotional intonation.
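As a rough illustration of how the two contexts above might be combined, the sketch below blends a text-sentiment score with a vocal-intonation score into a single positive/neutral/negative label. The score ranges, weighting, and thresholds are assumptions for illustration only, not OVA’s published model:

```python
# Simplified sketch of combining the two contexts: a text-sentiment score
# and a vocal-intonation score, each in [-1, 1] (negative = angry), are
# blended into one label. Weighting and thresholds are illustrative
# assumptions, not OVA's actual model.

def label_emotion(text_score: float, voice_score: float,
                  text_weight: float = 0.6) -> str:
    """Blend text and voice scores into positive/neutral/negative."""
    combined = text_weight * text_score + (1 - text_weight) * voice_score
    if combined > 0.25:
        return "positive"
    if combined < -0.25:
        return "negative"
    return "neutral"
```

With this toy model, a neutral transcript spoken calmly lands on “neutral,” while the same words delivered with an angry intonation and an expletive push the blended score into “negative.”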
This is where Azure Cognitive Services worked as a full AI solution, dissecting these call center conversations and associating words with emotions.
The question remained: How could we introduce essential soft skills for agents, such as adaptability, empathy, and resilience? The trainees should be able to understand, recognize, and acknowledge the client’s emotional status. Therefore, the accuracy of emotion detection, identification, and virtual rendering was essential.
“We decided to represent emotions on a scale bar of five colors, with emoji-style faces, from super angry (red) gradually to completely satisfied (green). The customer’s avatar also changes color and behavior according to the triggered feeling. Naturally, the emotional changes are in sync with the conversation,” explained Lapointe.
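The five-color scale Lapointe describes can be sketched as a simple bucketing of a combined emotion score; the score range, color names, and thresholds below are illustrative assumptions:

```python
# Sketch of the five-color emotion scale: a combined emotion score in
# [-1, 1] is bucketed into five emoji-style levels, from super angry (red)
# to completely satisfied (green). Thresholds are illustrative assumptions.

SCALE = ["red", "orange", "yellow", "light-green", "green"]

def emotion_color(score: float) -> str:
    """Map a score in [-1, 1] to one of the five scale colors."""
    score = max(-1.0, min(1.0, score))          # clamp to [-1, 1]
    index = min(int((score + 1.0) / 2.0 * 5), 4)  # shift, scale, bucket
    return SCALE[index]
```

In StellarX this bucket would then drive the customer avatar’s color and behavior, keeping the visual feedback in sync with the conversation.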
This emotional aspect helped OVA integrate into our platform StellarX a virtual assistant that can adapt itself to users based on how they feel.
What is your vision with the emotions trainer in the metaverse?
A rich literature has been published about soft skills and their relevance, especially in the workplace. Inclusive leadership, emotional intelligence, and good communication are sophisticated traits that are hard to detect, and even more challenging to teach.
Theory is one thing, but practice is another. Constant exercise with valuable feedback is a luxury few companies can afford. Usually, employees are stuck with a best-practices manual and few chances to practice, while managers don’t have enough time to share helpful insights. That is, until VR, guided by AI, becomes the game-changing training tool.
VR is a fully immersive and interactive medium. It helps develop a deep and wide emotional range thanks to its intrinsic sense of presence. For instance, PwC’s soft-skills training report indicates, “VR-learners felt 3.75 times more emotionally connected to the content than classroom learners and 2.3 times more connected than e-learners.” VR can be adapted to endless soft-skills scenarios, such as manager and sales training sessions.
One of the project’s goals was to develop a smart coach (an AI avatar companion) whose teaching methods are fed by AI. Given the rich data collected from training scenarios and real-life calls, it is possible to create a constantly improving digital education memory. This behavioral data library translates into better self-learning methods and more sophisticated analytics about each participant’s performance.
Finally, building confidence would be one of the most remarkable results of a VR training project such as this one. According to the same PwC report, “40% of the VR-learners saw an improvement in confidence compared to classroom learners and 35% improvement over e-learners to act on what they learned after training in VR.”
Why did you choose Azure when building your AI project?
Over the course of this machine learning project, we discovered that the creation process, from start to finish, becomes more straightforward and faster when using Azure. As a startup, delegating non-core tasks and focusing on your expertise is paramount. Thanks to Azure’s solutions, we could concentrate on what’s essential: creation.
“Azure Cognitive Services sped up the transition from experimentation to production, thanks to its user-friendly integration. Since then, it has been a key R&D tool, making Natural Language Understanding a core component of our human-centered immersive experiences,” said Lapointe.
Our XR creation tool, StellarX, allows anyone to build virtual experiences without coding skills. We believe this tool is a bridge to the metaverse, from training sessions to research, brainstorming, and collaboration.
We’ve been on a mission to create ethical and human immersive experiences accessible to everybody. Thanks to StellarX, we are democratizing metaverse creation, allowing user-generated content to revolutionize entire industries, such as healthcare, construction, and defense.
For more tips on launching your AI startup, sign up today for Microsoft for Startups Founders Hub.