Summary
This article provides guidance to help customers understand the Azure Customer Solution provision in the Azure Product Terms. The Azure Customer Solution terms are for customers like Software Development Companies (SDCs) who use Azure to build their own solutions. This article is not designed to introduce new terms or supplement existing ones and is not part of your volume licensing agreement with Microsoft. It is intended to provide insight into this topic and the intent of the Azure Customer Solution to help you ensure your solution is compliant.
Applicable products
This document applies to the use of Microsoft Azure Services under the Azure Customer Solution clause in the Product Terms for Microsoft Azure.
Azure Customer Solution provisions
Compliant solutions
The Azure Customer Solution clause in the Product Terms for Microsoft Azure is intended to enable organizations like Software Development Companies (SDCs) (the “Customer”) to purchase Microsoft Azure Services through a commercial licensing agreement to develop and deliver unified solutions, such as Software as a Service (SaaS) applications, that they sell to their end customers. To be compliant, your solution must add primary and significant functionality to the Microsoft Azure Services, per the following definition of “Customer Solution” in the Product Terms:
Customer Solution means any application that the Customer makes available to its end users consisting of Customer’s applications and the Microsoft Azure Services, whereby Customer’s application adds primary and significant functionality and is not primarily a substitute for the Microsoft Azure Services. Customer applications that only provide billing, license management, and/or infrastructure services (e.g., virtual machines, containers, storage, or management for such infrastructure services) do not constitute “primary and significant functionality.”
Common characteristics of a compliant solution include:
- The solution is the primary driver for the end customer’s purchase decision. The fact that it runs on Azure may be a secondary consideration, but more likely, the cloud platform the solution runs on is irrelevant or unknown to the end customers.
- Azure costs make up less than half of the cost charged to the end customer for the solution.
- Individual meters and meter prices aren’t exposed to the end customer.
- Individual Azure services are used to host solutions that serve multiple end customers (for example, a virtual machine hosts an application serving multiple customers).
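The cost criterion above can be illustrated with a minimal sketch. The function name and the dollar figures below are hypothetical, not taken from the Product Terms; the sketch simply shows the "less than half" arithmetic:

```python
def azure_cost_share(azure_cost: float, end_customer_price: float) -> float:
    """Fraction of the end-customer price attributable to Azure consumption."""
    if end_customer_price <= 0:
        raise ValueError("end-customer price must be positive")
    return azure_cost / end_customer_price

# Hypothetical figures: the solution bills the end customer $1,000/month,
# of which $300 covers the underlying Azure consumption.
share = azure_cost_share(azure_cost=300.0, end_customer_price=1000.0)
print(f"Azure share: {share:.0%}, under the 50% threshold: {share < 0.5}")
```

In this hypothetical, Azure charges are 30% of what the end customer pays, which satisfies the "less than half" characteristic; the remaining value must come from the solution's own functionality.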
Non-compliant solutions
The Azure Customer Solution is not appropriate if you want to resell Microsoft Azure Services without adding primary and significant functionality, even when the resale is combined with managed services, cost optimization, or enhanced support. If you are interested in reselling Azure Services with value-added services and support, evaluate the Cloud Solution Provider (CSP) program.
Provisions for Azure Model Inferencing APIs delivered by Azure AI Foundry
You may create and maintain a Customer Solution using model inferencing APIs for Microsoft Azure AI Foundry Models sold directly by Azure.
The Customer Solution requirement still applies to Microsoft Azure AI Foundry Models sold directly by Azure. Your Customer Solution should meet each of the following criteria:
- Resale is not allowed. You cannot simply resell the model inferencing APIs. You must combine them with your own “primary” and “significant” contributions or functionality in your Customer Solution.
- Not merely a substitute. Your solution cannot merely provide raw, thinly wrapped, or lightly processed outputs from the model inferencing APIs.
- Either embed or adapt the APIs. Your solution must either embed the model inferencing APIs within your applications, tools, and workflows, or adapt the model inferencing APIs by substantially altering their output through technology IP, data, context, or customizations unique to your platform.
- Your contribution is core. Your contribution to the Customer Solution represents the primary functionality driving the end customer’s purchase decision. Any charges for the model inferencing APIs that you pass along to your end customers should make up less than half of the total cost of the solution to the end customer.
- Solely for your platform. While you may permit your Customer Solution as a whole to be redistributed to downstream entities, the model inferencing APIs must remain within the confines of your platform and may not be distributed separately.
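The "embed or adapt" criterion above can be sketched in Python. Everything in this example is hypothetical (the platform name, prompt text, helper function, and context fields are assumptions, not part of the Product Terms); it illustrates a solution injecting its own system prompt and end-customer context before calling a model, rather than passing raw requests through to the API:

```python
# Hypothetical sketch: a platform "adapts" a model inferencing API by wrapping
# every request in its own system prompt and per-customer context, instead of
# exposing the raw endpoint to end customers.

PLATFORM_SYSTEM_PROMPT = (
    "You are the assistant inside AcmeLogistics (hypothetical platform). "
    "Answer only using the supplied shipment context and platform conventions."
)

def build_adapted_messages(user_query: str, customer_context: dict) -> list[dict]:
    """Combine the platform's system prompt, end-customer context, and the
    user's query into a chat request. The platform's contribution lives in
    this orchestration and context, not in the raw model output."""
    context_block = "\n".join(f"{k}: {v}" for k, v in customer_context.items())
    return [
        {"role": "system", "content": PLATFORM_SYSTEM_PROMPT},
        {"role": "system", "content": f"End-customer context:\n{context_block}"},
        {"role": "user", "content": user_query},
    ]

messages = build_adapted_messages(
    "When will order 1042 arrive?",
    {"customer": "Contoso", "order_1042_eta": "2025-06-03", "route": "SEA->LAX"},
)
# These messages would then be sent to an Azure OpenAI deployment, e.g. via
# the `openai` package's AzureOpenAI client (deployment name is a placeholder):
#   client.chat.completions.create(model="<your-deployment>", messages=messages)
```

By contrast, forwarding the end customer's prompt to the API unchanged and returning the raw response would be the kind of thin wrapper the non-compliant scenarios below describe.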
Providing access to the Azure Services
You may permit third parties to access and use Azure Services solely in connection with the use of your Customer Solution. You are responsible for:
- ensuring that third parties who access, use, or distribute your solution comply with the Product Terms, the terms and conditions of your licensing agreement, and all applicable laws; and
- obtaining any necessary licenses related to Standards in a Customer Solution.
Example scenarios
Each scenario is unique, and you should evaluate your solution based on the Product Terms. Here are some example scenarios and whether they would be allowed under the Azure Customer Solution clause.
| Scenario | Allowed under Azure Customer Solution clause? |
|---|---|
| A travel company has an online reservation system deployed in Azure that’s used by unaffiliated third parties (i.e., other travel companies) for the end customers of the travel company and end customers of the travel company’s unaffiliated third parties. | Yes |
| A managed service provider adds a user interface (UI) to its Azure instance and claims the UI is a hosted solution. | No |
| An infrastructure provider uses its own Azure tenant to provide managed infrastructure services to a third party or its end customers. | No |
| A reseller sells pure Azure Services to a customer. | No |
| An on-premises hoster offers an IaaS and/or SaaS solution to its end customers and wants to use Microsoft Azure Services licensed under the hoster’s internal-use commercial licensing agreement to provide security on the servers. | Yes |
Example scenarios for Microsoft Azure AI Foundry Models sold directly by Azure
| Scenario | Allowed under Azure Customer Solution clause? |
|---|---|
| Platform tools include embedded Azure model inferencing APIs. Example: A data platform provider embeds Azure OpenAI GPT-4o model APIs behind the scenes into its workflow automation tools. These APIs are used to generate UI screens based on natural language descriptions, rewrite code, suggest next steps, or assign tasks. | Yes |
| Toolkit combines Azure model inferencing APIs alongside other APIs as part of workflows. Example: A CRM platform provider offers a developer toolkit that includes its own APIs for identity resolution and segmentation, Microsoft Azure AI model APIs, and third-party APIs. The toolkit’s workflows enable developers to ingest customer data using third-party APIs, enrich it using the platform’s proprietary identity graph, and apply natural language and predictive learning via Microsoft Azure AI model APIs. The resulting application does not rely solely on Azure model inferencing APIs. | Yes |
| Toolkit combines Azure model inferencing APIs alongside other platform capabilities as part of workflows. Example: A cybersecurity platform provider offers a toolkit that orchestrates Azure xAI Grok 4 Fast model APIs to process and analyze incident data. These APIs are always used in conjunction with the customer’s platform-native orchestration logic and threat risk scoring machine-learning models. | Yes |
| Toolkit integrates Azure model inferencing APIs with substantial prompt-level customizations for its platform. Example: A data platform provider includes Azure OpenAI GPT-4o model APIs in its development toolkit but modifies the system prompt and adds orchestration or custom prompt engineering to shape the output for downstream use in its platform. | Yes |
| Toolkit integrates Azure model inferencing APIs with substantial fine-tuning customizations for its platform. Example: A logistics platform provider embeds Azure OpenAI GPT-4o model APIs into its operational toolkit and applies fine-tuning to optimize model behavior for supply chain-specific use cases, training the model on proprietary logistics data (such as delivery schedules, route efficiency metrics, and historical demand patterns) to shape its performance in tasks like predictive delivery timing, exception handling, and customer communication for downstream use in its platform. | Yes |
| Toolkit integrates Azure model inferencing APIs with end-customer context from its platform. Example: A data platform provider includes Azure OpenAI GPT-5 model APIs in its development toolkit, and its workflows and orchestration consistently apply specific end-customer context from the customer’s platform to shape the output for downstream use in its platform. | Yes |
| Toolkit integrates Azure model inferencing APIs with other platform capabilities as part of workflows. Example: A supply chain platform provider includes Azure OpenAI GPT-4.1 model APIs in its development toolkit, but these are always leveraged in combination with the customer’s proprietary routing engine and platform APIs for warehouse data (e.g., to factor in real-time inventory information). | Yes |
| Toolkit where model inferencing API charges are a minority of total cost. Example: A financial analytics platform provider integrates Azure OpenAI GPT-4o model APIs into its suite, using the APIs to generate summary reports and automate data categorization. The platform’s proprietary analytics engine, custom visualization dashboards, and workflow automation tools are the main drivers of customer value. The cost breakdown shows that charges for Azure model inferencing APIs account for less than 20% of the total cost paid by the end customer. | Yes |
| Toolkit exposes Azure model inferencing APIs that yield identical results. Example: A platform provider includes Azure OpenAI GPT-5 model APIs in its toolkit and exposes them directly to end customers, without any customizations or application of end-customer data or context to shape the output. | No |
| Toolkit wraps Azure model inferencing APIs but doesn’t substantially change output. Example: A customer includes Azure DeepSeek-R1 model APIs in its toolkit, changes the endpoint format, but returns the same results as Azure. | No |
| Toolkit routes requests between different model inferencing API providers. Example: A customer includes multiple Azure model inferencing APIs in its toolkit but merely routes requests between multiple inferencing providers (e.g., Azure OpenAI, Azure Grok, Anthropic, Meta) without modifying or shaping the output. | No |
| Toolkit where model inferencing API charges are a majority of total cost. Example: A SaaS provider offers a toolkit that exposes Azure OpenAI GPT-5 model APIs directly to end customers for text generation and analysis, with only minimal branding and user interface changes. The customer’s toolkit adds some proprietary functionality, but the cost breakdown shows that charges for Azure model inferencing APIs account for more than 60% of the total cost paid by the end customer. | No |
Frequently asked questions
See FAQ page