22 changes: 7 additions & 15 deletions examples/Python/ChatApp/README.md
@@ -1,21 +1,20 @@
# Azure App Configuration - Python ChatApp Sample

This sample demonstrates using Azure App Configuration to configure Azure OpenAI settings for a chat application built with Python.
This sample demonstrates using Azure App Configuration to configure Azure AI Foundry settings for a chat application built with Python.

## Features

- Integrates with Azure OpenAI for chat completions
- Integrates with Azure AI Foundry for chat completions
- Dynamically refreshes configuration from Azure App Configuration

## Prerequisites

- Python 3.8 or later
- Python 3.9 or later
- An Azure subscription with access to:
- Azure App Configuration service
- Azure OpenAI service
- Azure AI Foundry project
- Required environment variables:
- `AZURE_APPCONFIGURATION_ENDPOINT`: Endpoint URL of your Azure App Configuration instance
- `AZURE_OPENAI_API_KEY`: API key for Azure OpenAI (optional if stored in Azure App Configuration)

## Setup

@@ -29,15 +28,9 @@ This sample demonstrates using Azure App Configuration to configure Azure OpenAI
1. Configure your Azure App Configuration store with these settings:

```console
ChatApp:AzureOpenAI:Endpoint - Your Azure OpenAI endpoint URL
ChatApp:AzureOpenAI:DeploymentName - Your Azure OpenAI deployment name
ChatApp:AzureOpenAI:ApiVersion - API version for Azure OpenAI (e.g., "2023-05-15")
ChatApp:AzureOpenAI:ApiKey - Your Azure OpenAI API key (Optional only required when not using AAD, preferably as a Key Vault reference)
ChatApp:Model - An AI configuration entry containing the following settings:
- model - Model name (e.g., "gpt-35-turbo")
- max_tokens - Maximum tokens for completion (e.g., 1000)
- temperature - Temperature parameter (e.g., 0.7)
- top_p - Top p parameter (e.g., 0.95)
ChatApp:AzureAIFoundry:Endpoint - Your Azure AI Foundry project endpoint URL
ChatApp:ChatCompletion - An AI configuration entry containing the following settings:
- model - Model name (e.g., "gpt-5")
- messages - An array of messages with role and content for each message
ChatApp:Sentinel - A sentinel key to trigger configuration refreshes
```
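The `ChatApp:ChatCompletion` entry above is stored as a JSON value and unpacked directly into a dataclass in `app.py`. A minimal standalone sketch of that mapping (field names assumed from `models.py` in this PR; the example value is illustrative):

```python
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class ChatCompletionConfiguration:
    model: Optional[str] = None
    max_completion_tokens: Optional[int] = None
    messages: Optional[List[Dict[str, str]]] = None


# Example JSON value as it might be stored under ChatApp:ChatCompletion
setting = {
    "model": "gpt-5",
    "messages": [{"role": "system", "content": "You are a helpful assistant."}],
}

# The provider returns the parsed dict; ** unpacks it into the dataclass
config = ChatCompletionConfiguration(**setting)
print(config.model)  # gpt-5
```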
@@ -46,7 +39,6 @@ This sample demonstrates using Azure App Configuration to configure Azure OpenAI

```bash
export AZURE_APPCONFIGURATION_ENDPOINT="https://your-appconfig.azconfig.io"
export AZURE_OPENAI_API_KEY="your-openai-api-key" # Optional if stored in Azure App Configuration
```

## Running the Application
58 changes: 20 additions & 38 deletions examples/Python/ChatApp/app.py
@@ -4,20 +4,19 @@
# license information.
# --------------------------------------------------------------------------
"""
Azure OpenAI Chat Application using Azure App Configuration.
Azure AI Foundry Chat Application using Azure App Configuration.
This script demonstrates how to create a chat application that uses Azure App Configuration
to manage settings and Azure OpenAI to power chat interactions.
to manage settings and Azure AI Foundry to power chat interactions.
"""

import os
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from azure.identity import DefaultAzureCredential
from azure.appconfiguration.provider import load, SettingSelector, WatchKey
from openai import AzureOpenAI
from models import AzureOpenAIConfiguration, ChatCompletionConfiguration
from azure.ai.inference import ChatCompletionsClient
from models import AzureAIFoundryConfiguration, ChatCompletionConfiguration

APP_CONFIG_ENDPOINT_KEY = "AZURE_APPCONFIGURATION_ENDPOINT"


# Initialize CREDENTIAL
CREDENTIAL = DefaultAzureCredential()

@@ -43,15 +42,12 @@ def main():
)
configure_app()

azure_openai_config = AzureOpenAIConfiguration(
api_key=APPCONFIG.get("AzureOpenAI:ApiKey", ""),
endpoint=APPCONFIG.get("AzureOpenAI:Endpoint", ""),
deployment_name=APPCONFIG.get("AzureOpenAI:DeploymentName", ""),
api_version=APPCONFIG.get("AzureOpenAI:ApiVersion", ""),
azure_foundry_config = AzureAIFoundryConfiguration(
endpoint=APPCONFIG.get("AzureAIFoundry:Endpoint", "")
)
azure_client = create_azure_openai_client(azure_openai_config)
chat_client = create_chat_client(azure_foundry_config)

chat_conversation = []
chat_conversation = []

print("Chat started! What's on your mind?")

@@ -74,16 +70,14 @@ def main():
chat_messages.extend(chat_conversation)

# Get AI response and add it to chat conversation
response = azure_client.chat.completions.create(
model=azure_openai_config.deployment_name,
response = chat_client.complete(
model=CHAT_COMPLETION_CONFIG.model,
messages=chat_messages,
max_tokens=CHAT_COMPLETION_CONFIG.max_tokens,
Member Author: This is currently removed due to a bug in the Python SDK (Azure/azure-sdk-for-python#44455).

temperature=CHAT_COMPLETION_CONFIG.temperature,
top_p=CHAT_COMPLETION_CONFIG.top_p,
max_tokens=CHAT_COMPLETION_CONFIG.max_completion_tokens,
)

ai_response = response.choices[0].message.content
chat_conversation .append({"role": "assistant", "content": ai_response})
chat_conversation.append({"role": "assistant", "content": ai_response})
print(f"AI: {ai_response}")


@@ -96,27 +90,15 @@ def configure_app():
CHAT_COMPLETION_CONFIG = ChatCompletionConfiguration(**APPCONFIG["ChatCompletion"])


def create_azure_openai_client(azure_openai_config: AzureOpenAIConfiguration) -> AzureOpenAI:
def create_chat_client(config: AzureAIFoundryConfiguration) -> ChatCompletionsClient:
"""
Create an Azure OpenAI client using the configuration from Azure App Configuration.
Create a ChatCompletionsClient using the configuration from Azure App Configuration.
"""
if azure_openai_config.api_key:
return AzureOpenAI(
azure_endpoint=azure_openai_config.endpoint,
api_key=azure_openai_config.api_key,
api_version=azure_openai_config.api_version,
azure_deployment=azure_openai_config.deployment_name,
)
else:
return AzureOpenAI(
azure_endpoint=azure_openai_config.endpoint,
azure_ad_token_provider=get_bearer_token_provider(
CREDENTIAL,
"https://cognitiveservices.azure.com/.default",
),
api_version=azure_openai_config.api_version,
azure_deployment=azure_openai_config.deployment_name,
)
return ChatCompletionsClient(
endpoint=config.endpoint,
credential=CREDENTIAL,
credential_scopes=["https://cognitiveservices.azure.com/.default"],
)


if __name__ == "__main__":
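The chat loop above clones the configured messages each turn and extends them with the running conversation before calling `complete`. A standalone sketch of that flow, with a stubbed client so it runs without Azure (`FakeChatClient` is hypothetical, only for illustration):

```python
class FakeChatClient:
    """Stand-in for ChatCompletionsClient; echoes the last user message."""

    def complete(self, *, model, messages, max_tokens=None):
        last_user = next(m for m in reversed(messages) if m["role"] == "user")
        return {"choices": [{"message": {"content": f"Echo: {last_user['content']}"}}]}


configured_messages = [{"role": "system", "content": "You are a helpful assistant."}]
chat_conversation = []


def chat_turn(client, user_input):
    chat_conversation.append({"role": "user", "content": user_input})
    chat_messages = configured_messages.copy()  # clone the configured messages
    chat_messages.extend(chat_conversation)     # append the conversation so far
    response = client.complete(model="gpt-5", messages=chat_messages)
    ai_response = response["choices"][0]["message"]["content"]
    chat_conversation.append({"role": "assistant", "content": ai_response})
    return ai_response


print(chat_turn(FakeChatClient(), "hello"))  # Echo: hello
```

Note that because `chat_messages` is rebuilt from a copy each turn, the configured system messages are never mutated by the running conversation.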
16 changes: 7 additions & 9 deletions examples/Python/ChatApp/models.py
@@ -4,22 +4,19 @@
# license information.
# --------------------------------------------------------------------------
"""
Model classes for Azure OpenAI Chat Application.
Model classes for Azure AI Foundry Chat Application.
"""
from dataclasses import dataclass
from typing import List, Optional, Dict


@dataclass
class AzureOpenAIConfiguration:
class AzureAIFoundryConfiguration:
"""
Represents the configuration for Azure OpenAI service.
Represents the configuration for Azure AI Foundry service.
"""

api_key: str
endpoint: str
deployment_name: str
api_version: Optional[str] = None


@dataclass
@@ -28,8 +25,9 @@ class ChatCompletionConfiguration:
Represents the configuration for an AI model including messages and parameters.
"""

max_tokens: int
temperature: float
top_p: float
model: Optional[str] = None
max_completion_tokens: Optional[int] = None
reasoning_effort: Optional[str] = None
verbosity: Optional[str] = None
stream: Optional[bool] = None
messages: Optional[List[Dict[str, str]]] = None
Comment on lines 28 to 33 (Copilot AI, Feb 27, 2026): `messages` is typed as `Optional[...] = None`, but the application logic expects a list of messages to always exist (it clones and extends it each turn). Consider changing this field to a non-optional list with `default_factory=list` to reflect actual usage and avoid `None` at runtime.
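The `default_factory` change suggested above can be sketched as follows (a hedged variant, not part of this PR; other fields are elided for brevity):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class ChatCompletionConfiguration:
    model: Optional[str] = None
    max_completion_tokens: Optional[int] = None
    # Non-optional list built per-instance by default_factory, so callers
    # can always copy()/extend() without a None check.
    messages: List[Dict[str, str]] = field(default_factory=list)


config = ChatCompletionConfiguration()
chat_messages = config.messages.copy()  # safe even when messages was never set
chat_messages.extend([{"role": "user", "content": "hi"}])
print(len(chat_messages))  # 1
```

Using `field(default_factory=list)` rather than a bare `[]` default also avoids the classic pitfall of all instances sharing one mutable list.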
2 changes: 1 addition & 1 deletion examples/Python/ChatApp/requirements.txt
@@ -1,3 +1,3 @@
azure-identity
azure-appconfiguration-provider<3.0.0
openai
azure-ai-inference