7/25/2025

Audi AG builds secure, scalable AI assistant in just two weeks with Azure

Facing rising employee demand for faster access to information, AUDI AG needed a smarter way to deliver answers—without increasing support overhead or compromising security.

To move quickly with confidence, AUDI AG used Microsoft solutions—including Azure AI Foundry, Azure App Service, and Azure Cosmos DB—to deploy a secure, scalable foundation for enterprise AI.

In just two weeks, AUDI AG deployed its first AI-powered assistant and is now expanding the same scalable framework to launch eight additional agents across the enterprise.


 AUDI AG is known worldwide for redefining what's possible in automotive design, and that same spirit of innovation drives its internal operations. As employee expectations for fast, seamless access to information grew, Audi leaders saw an opportunity to modernize how support services were delivered across the enterprise.  

Rather than adding more manual processes or increasing operational overhead, Audi wanted to empower employees with real-time access to critical information—anytime, anywhere. To meet rising demand and scale support efficiently, the company set an ambitious goal: deliver a production-ready, AI-powered self-service experience in weeks, not months.  

“To provide an enhanced employee experience and relieve operational HR from answering easy questions, it became clear that employees needed convenient access to information anytime they needed it,” says Melissa Wischner, HR Digitalization at Audi. “Our goal was to create a self-service solution that delivers accurate, specific, and fast answers to employee questions 24/7.” 

Audi puts innovation to work inside the enterprise   

To improve employee access to information, Audi accelerated chatbot development using several Microsoft solutions—including Azure AI Foundry, Azure App Service, and Azure Cosmos DB. Azure AI Foundry provided a customizable architecture that helped the team move fast while meeting enterprise requirements for security, scalability, and compliance.  Azure Functions acts as a connector between Azure Key Vault and Azure Cosmos DB, enabling secure, event-driven data flow within Audi’s chatbot framework.

“We had a lot of expertise in AI and web applications, but we had never built an Azure application before,” notes Hendrik Drath, Solution Architect, AI and Big Data at Audi. “I found a solution accelerator that provided everything we needed—infrastructure code, back-end orchestration, and a ready-to-go front end.”  In just two weeks, Audi launched its first AI-powered assistant, the Smart Assistant for HR, delivering real-time, natural-language answers with enterprise-grade performance.  


Moving fast, staying secure with enterprise AI   

Audi knew that accelerating development with confidence required the right foundation.  

“The Azure accelerator helped us skip months of foundational work. It gave us a solid base, so we could build quickly and roll out a proof of concept for an enterprise use-case,” says Apoorva Suresh, Machine Learning Engineer on Audi’s AI Enterprise Team. 

Audi’s chatbots run on a tightly integrated Azure stack that delivers security, scale, and real-time performance. Azure App Service provides the managed infrastructure Audi needs for production workloads, including built-in authentication, telemetry, and compliance monitoring. This simplified operations by reducing infrastructure complexity, helping Audi’s development team focus on rapid innovation and efficient releases.

To power intelligent, real-time responses, Audi relies on two products in Azure AI Foundry:  Azure OpenAI in Foundry Models and Azure AI Search. Together, they surface accurate answers from internal sources, including SharePoint and Yammer, using a retrieval-augmented generation (RAG) approach. 

“RAG forms the core logic of our system,” Suresh shares. “We use Azure AI Search to store and retrieve vectorized content. Azure OpenAI generates the final response based on the retrieved context.”
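The RAG flow Suresh describes can be sketched in a few lines of Python. This is a simplified, self-contained illustration, not Audi's implementation: an in-memory vector index with cosine similarity stands in for Azure AI Search, the hard-coded embeddings stand in for an Azure OpenAI embedding deployment, and the document names and prompt template are invented for the example.

```python
import math

# Toy stand-in for the vectorized content stored in a search index.
# In a production RAG system, embeddings would come from an embedding
# model and retrieval would be a vector query against the index.
DOCUMENTS = {
    "doc-parental-leave": ("Employees may take up to 12 months of parental leave.", [0.9, 0.1, 0.0]),
    "doc-vpn-setup":      ("Install the VPN client from the IT portal to work remotely.", [0.1, 0.9, 0.1]),
    "doc-travel-policy":  ("Business travel must be approved by your line manager.", [0.0, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def retrieve(query_vector, k=2):
    """Rank documents by similarity to the query embedding (mimics a vector search)."""
    ranked = sorted(DOCUMENTS.values(), key=lambda doc: cosine(query_vector, doc[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, context_chunks):
    """Ground the model in retrieved context — the 'augmented' step of RAG."""
    context = "\n".join(f"- {chunk}" for chunk in context_chunks)
    return (
        "Answer the employee's question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
    )

# A query whose (assumed) embedding is closest to the parental-leave document.
prompt = build_prompt("How long is parental leave?", retrieve([0.85, 0.15, 0.05], k=1))
print(prompt)
```

In the real system, the assembled prompt would then be sent to the chat model, which generates the final answer from the retrieved context rather than from its training data alone.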

To manage millions of discussions in large communities, Audi pairs the Microsoft Viva Engage connector with Azure Cosmos DB. The process starts with a bulk upload of existing discussions, followed by a caching and high-watermark system that syncs only new discussions. This approach keeps updates fast and efficient for Viva Engage (formerly Yammer), so information stays current and accessible.
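The bulk-upload-then-incremental pattern described above can be sketched as follows. This is an illustrative sketch only: the in-memory dictionary stands in for the Azure Cosmos DB container, and field names such as `id` and `updated_at` are assumptions, not the connector's actual schema.

```python
from datetime import datetime, timezone

store = {}          # stand-in for the Cosmos DB container caching discussions
watermark = None    # timestamp of the newest discussion synced so far

def sync(discussions):
    """Upsert only discussions newer than the current high watermark.

    The first call acts as the bulk upload (the watermark is unset, so
    everything qualifies); later calls pull just the delta, which keeps
    syncing millions of community discussions cheap.
    """
    global watermark
    delta = [d for d in discussions if watermark is None or d["updated_at"] > watermark]
    for d in delta:
        store[d["id"]] = d
    if delta:
        watermark = max(d["updated_at"] for d in delta)
    return len(delta)

def t(day):
    return datetime(2025, 7, day, tzinfo=timezone.utc)

# Bulk upload of the existing discussions.
synced_initial = sync([{"id": 1, "updated_at": t(1)}, {"id": 2, "updated_at": t(2)}])
# Incremental run: only the discussion newer than the watermark is synced.
synced_delta = sync([{"id": 1, "updated_at": t(1)}, {"id": 3, "updated_at": t(3)}])
```

The watermark makes each sync idempotent over already-seen data: re-sending old discussions costs nothing, and only genuinely new content triggers writes.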

Azure Cosmos DB will also serve as the backbone for developing a performance cockpit, enabling chatbot customers to monitor key performance metrics and identify opportunities for improving data quality. 


To safeguard operations, Audi used the native security toolset in Azure, including Microsoft Defender for Cloud and Azure Key Vault. From encryption to compliance scoring, the team had full operational visibility to meet security requirements across multiple internal stakeholders.  

“We put different compliance controls in place using Azure App Service and other tools and even used Azure OpenAI to enforce non-compliant answer filtering. Security wasn’t an afterthought—it was built into the foundation,” explains Drath.  

The following diagram shows the architecture of the solutions Audi uses to create its new chatbots. A user submits a request for information to the chatbot, and Azure Cosmos DB, Azure AI Search, and Azure OpenAI each play a pivotal role in retrieving and delivering the data.

Navigating the road ahead with confidence  

Audi’s modular, production-grade architecture allows teams to reuse proven patterns for further large language model (LLM) solutions while adapting to each department’s unique needs. After the success of the Smart Assistant for HR, Audi launched a second chatbot for its Enterprise Helpdesk, extending the same framework to IT support scenarios.  

“The experience of getting answers is night and day. Working with Microsoft means we’re saving time for employees and operational HR,” says Wischner. 

That success is now fueling enterprise-wide expansion. Audi is building more LLM solutions across different business areas—each tailored to business-specific data and workflows but sharing a common foundation for deployment, security, and compliance. 

“We’ve been working with Azure for quite some time now, and we’re seeing even greater strides recently,” adds Suresh. “Microsoft understands our needs—and it’s only getting easier to scale new use cases into production.” 


What Audi learned: Start small. Build smart. Hit the gas.   

Audi’s approach wasn’t just about building a chatbot. It was about building trust across the organization. By delivering fast, secure, and scalable access to information, the team earned confidence from both leadership and employees.   

The success of the first chatbots did more than validate the technology. It reinforced that enterprise-grade AI can be deployed responsibly, at speed, offering clear lessons for business and technical decision-makers:   

  • Start with a real business challenge: HR’s use case was urgent and actionable.   

  • Build on proven frameworks: Azure AI Foundry, Azure App Service, and Azure Cosmos DB provided a strong starting point, enabling the AI Enterprise Team to efficiently develop a robust, modular framework tailored to Audi's specific enterprise needs. 

  • Architect for scale: A modular stack helped ensure fast, secure deployment across departments.  

With multiple chatbots live and more in development, Audi is preparing for next-generation capabilities, such as multi-agent orchestration and deeper integration into business systems. This isn’t experimentation. It’s operational AI at enterprise scale in weeks.  

“We’re already seeing the relief. Automating answers lifts a huge weight off internal teams and helps employees get what they need, fast. With secure, lean solutions like Azure, it’s easy to scale what works and focus on what matters,” says Drath.  

Discover more about AUDI AG on Facebook, Instagram, LinkedIn, and YouTube. 
