The Microsoft Agent Framework provides support for several types of agents to accommodate different use cases and requirements.
All agents derive from a common base class, AIAgent, which provides a consistent interface across agent types. This enables building common, agent-agnostic, higher-level functionality such as multi-agent orchestrations.
Important
If you use the Microsoft Agent Framework to build applications that operate with third-party servers or agents, you do so at your own risk. We recommend reviewing all data being shared with third-party servers or agents and being cognizant of third-party practices for retention and location of data. It is your responsibility to manage whether your data will flow outside of your organization’s Azure compliance and geographic boundaries and any related implications.
Simple agents based on inference services
The agent framework makes it easy to create simple agents based on many different inference services.
Any inference service that provides a Microsoft.Extensions.AI.IChatClient implementation can be used to build these agents. Microsoft.Agents.AI.ChatClientAgent is the agent class that provides an agent for any IChatClient implementation.
These agents support a wide range of functionality out of the box:
- Function calling
- Multi-turn conversations with local or service-provided chat history management
- Custom service-provided tools (e.g., MCP, code execution)
- Structured output
To create one of these agents, simply construct a ChatClientAgent using the IChatClient implementation of your choice.
using Microsoft.Agents.AI;
var agent = new ChatClientAgent(chatClient, instructions: "You are a helpful assistant");
For many popular services, we also have helpers to make creating these agents even easier. See the documentation for each service for more information:
| Underlying Inference Service | Description | Service Chat History storage supported | Custom Chat History storage supported |
|---|---|---|---|
| Azure AI Foundry Agent | An agent that uses the Azure AI Foundry Agents Service as its backend. | Yes | No |
| Azure AI Foundry Models ChatCompletion | An agent that uses any of the models deployed in the Azure AI Foundry Service as its backend via ChatCompletion. | No | Yes |
| Azure AI Foundry Models Responses | An agent that uses any of the models deployed in the Azure AI Foundry Service as its backend via Responses. | No | Yes |
| Azure OpenAI ChatCompletion | An agent that uses the Azure OpenAI ChatCompletion service. | No | Yes |
| Azure OpenAI Responses | An agent that uses the Azure OpenAI Responses service. | Yes | Yes |
| OpenAI ChatCompletion | An agent that uses the OpenAI ChatCompletion service. | No | Yes |
| OpenAI Responses | An agent that uses the OpenAI Responses service. | Yes | Yes |
| OpenAI Assistants | An agent that uses the OpenAI Assistants service. | Yes | No |
| Any other IChatClient | You can also use any other Microsoft.Extensions.AI.IChatClient implementation to create an agent. | Varies | Varies |
Complex custom agents
It is also possible to create fully custom agents that are not just wrappers around an IChatClient.
The agent framework provides the AIAgent base type.
This base type is the core abstraction for all agents; subclassing it allows complete control over the agent's behavior and capabilities.
See the documentation for Custom Agents for more information.
Proxies for remote agents
The agent framework provides out of the box AIAgent implementations for common service hosted agent protocols,
such as A2A. This way you can easily connect to and use remote agents from your application.
See the documentation for each agent type for more information:
| Protocol | Description |
|---|---|
| A2A | An agent that serves as a proxy to a remote agent via the A2A protocol. |
Azure and OpenAI SDK Options Reference
When using Azure AI Foundry, Azure OpenAI, or OpenAI services, you have various SDK options to connect to these services. In some cases, it is possible to use multiple SDKs to connect to the same service, or to use the same SDK to connect to different services. Here is a list of the available options, with the URL to use when connecting to each. Make sure to replace <resource> and <project> with your actual resource and project names.
| AI Service | SDK | NuGet Package | URL |
|---|---|---|---|
| Azure AI Foundry Models | Azure OpenAI SDK ² | Azure.AI.OpenAI | https://ai-foundry-<resource>.services.ai.azure.com/ |
| Azure AI Foundry Models | OpenAI SDK ³ | OpenAI | https://ai-foundry-<resource>.services.ai.azure.com/openai/v1/ |
| Azure AI Foundry Models | Azure AI Inference SDK ² | Azure.AI.Inference | https://ai-foundry-<resource>.services.ai.azure.com/models |
| Azure AI Foundry Agents | Azure AI Persistent Agents SDK | Azure.AI.Agents.Persistent | https://ai-foundry-<resource>.services.ai.azure.com/api/projects/ai-project-<project> |
| Azure OpenAI ¹ | Azure OpenAI SDK ² | Azure.AI.OpenAI | https://<resource>.openai.azure.com/ |
| Azure OpenAI ¹ | OpenAI SDK | OpenAI | https://<resource>.openai.azure.com/openai/v1/ |
| OpenAI | OpenAI SDK | OpenAI | No URL required |
¹ See Upgrading from Azure OpenAI to Azure AI Foundry.
² We recommend using the OpenAI SDK instead.
³ While we recommend using the OpenAI SDK to access Azure AI Foundry models, Azure AI Foundry Models supports models from many different vendors, not just OpenAI. All of these models are supported via the OpenAI SDK.
Using the OpenAI SDK
As shown in the table above, the OpenAI SDK can be used to connect to multiple services.
Depending on the service you are connecting to, you may need to set a custom URL when creating the OpenAIClient.
You can also use different authentication mechanisms depending on the service.
If a custom URL is required (see table above), you can set it via the OpenAIClientOptions.
var clientOptions = new OpenAIClientOptions() { Endpoint = new Uri(serviceUrl) };
It's possible to use an API key when creating the client.
OpenAIClient client = new OpenAIClient(new ApiKeyCredential(apiKey), clientOptions);
When using an Azure Service, it's also possible to use Azure credentials instead of an API key.
OpenAIClient client = new OpenAIClient(new BearerTokenPolicy(new AzureCliCredential(), "https://ai.azure.com/.default"), clientOptions);
Once you have created the OpenAIClient, you can get a sub client for the specific service you want to use and then create an AIAgent from that.
AIAgent agent = client
    .GetChatClient(model)
    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker");
Using the Azure OpenAI SDK
This SDK can be used to connect to both Azure OpenAI and Azure AI Foundry Models services.
Either way, you will need to supply the correct service URL when creating the AzureOpenAIClient.
See the table above for the correct URL to use.
AIAgent agent = new AzureOpenAIClient(
        new Uri(serviceUrl),
        new AzureCliCredential())
    .GetChatClient(deploymentName)
    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker");
Using the Azure AI Persistent Agents SDK
This SDK is only supported with the Azure AI Foundry Agents service. See the table above for the correct URL to use.
var persistentAgentsClient = new PersistentAgentsClient(serviceUrl, new AzureCliCredential());
AIAgent agent = await persistentAgentsClient.CreateAIAgentAsync(
    model: deploymentName,
    name: "Joker",
    instructions: "You are good at telling jokes.");
Simple agents based on inference services
The agent framework makes it easy to create simple agents based on many different inference services. Any inference service that provides a chat client implementation can be used to build these agents.
These agents support a wide range of functionality out of the box:
- Function calling
- Multi-turn conversations with local or service-provided chat history management
- Custom service-provided tools (e.g., MCP, code execution)
- Structured output
- Streaming responses
To create one of these agents, simply construct a ChatAgent using the chat client implementation of your choice.
from agent_framework import ChatAgent
from agent_framework.azure import AzureAIAgentClient
from azure.identity.aio import DefaultAzureCredential
async with (
    DefaultAzureCredential() as credential,
    ChatAgent(
        chat_client=AzureAIAgentClient(async_credential=credential),
        instructions="You are a helpful assistant"
    ) as agent
):
    response = await agent.run("Hello!")
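The feature list above includes structured output: asking the model to reply as JSON matching a schema you define. Below is a minimal sketch using Pydantic. The `response_format=` parameter shown in the comment is an assumption (check your chat client's API), and the round trip is simulated locally because a live call needs credentials:

```python
from pydantic import BaseModel


class WeatherReport(BaseModel):
    location: str
    temperature_c: float
    conditions: str


# With a live agent, the call might look like this (the response_format
# parameter name is an assumption; check your chat client's API):
# response = await agent.run("What's the weather in Seattle?", response_format=WeatherReport)

# Simulate the model's JSON reply and validate it locally:
raw = '{"location": "Seattle", "temperature_c": 25.0, "conditions": "sunny"}'
report = WeatherReport.model_validate_json(raw)
print(report.location, report.conditions)
```

Validating the reply against a schema this way turns malformed model output into an immediate, debuggable error instead of a silent downstream failure.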
Alternatively, you can use the convenience method on the chat client:
from agent_framework.azure import AzureAIAgentClient
from azure.identity.aio import DefaultAzureCredential
async with DefaultAzureCredential() as credential:
    agent = AzureAIAgentClient(async_credential=credential).create_agent(
        instructions="You are a helpful assistant"
    )
For detailed examples, see the agent-specific documentation sections below.
Supported Agent Types
| Underlying Inference Service | Description | Service Chat History storage supported | Custom Chat History storage supported |
|---|---|---|---|
| Azure AI Agent | An agent that uses the Azure AI Agents Service as its backend. | Yes | No |
| Azure OpenAI Chat Completion | An agent that uses the Azure OpenAI Chat Completion service. | No | Yes |
| Azure OpenAI Responses | An agent that uses the Azure OpenAI Responses service. | Yes | Yes |
| OpenAI Chat Completion | An agent that uses the OpenAI Chat Completion service. | No | Yes |
| OpenAI Responses | An agent that uses the OpenAI Responses service. | Yes | Yes |
| OpenAI Assistants | An agent that uses the OpenAI Assistants service. | Yes | No |
| Any other ChatClient | You can also use any other chat client implementation to create an agent. | Varies | Varies |
Function Tools
You can provide function tools to agents for enhanced capabilities:
from typing import Annotated
from pydantic import Field
from azure.identity.aio import DefaultAzureCredential
from agent_framework.azure import AzureAIAgentClient
def get_weather(location: Annotated[str, Field(description="The location to get the weather for.")]) -> str:
    """Get the weather for a given location."""
    return f"The weather in {location} is sunny with a high of 25°C."

async with (
    DefaultAzureCredential() as credential,
    AzureAIAgentClient(async_credential=credential).create_agent(
        instructions="You are a helpful weather assistant.",
        tools=get_weather
    ) as agent
):
    response = await agent.run("What's the weather in Seattle?")
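The `Annotated`/`Field` metadata on `get_weather` is what a framework can introspect to build the tool's parameter schema. This framework-free snippet shows that metadata at runtime; it assumes only `typing` and Pydantic, not any specific introspection API of the Agent Framework:

```python
from typing import Annotated, get_type_hints

from pydantic import Field


def get_weather(location: Annotated[str, Field(description="The location to get the weather for.")]) -> str:
    """Get the weather for a given location."""
    return f"The weather in {location} is sunny with a high of 25°C."


# Resolve the annotations, keeping the Annotated[...] extras:
hints = get_type_hints(get_weather, include_extras=True)

# The Field(...) object rides along in __metadata__ and carries the description:
field_info = hints["location"].__metadata__[0]
print(field_info.description)
```

This is why plain positional docs are not enough: the description travels with the parameter's type annotation, where tooling can find it.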
Streaming Responses
Agents support both regular and streaming responses:
# Regular response (wait for complete result)
response = await agent.run("What's the weather like in Seattle?")
print(response.text)

# Streaming response (get results as they are generated)
async for chunk in agent.run_stream("What's the weather like in Portland?"):
    if chunk.text:
        print(chunk.text, end="", flush=True)
Code Interpreter Tools
Azure AI agents support hosted code interpreter tools for executing Python code:
from agent_framework import ChatAgent, HostedCodeInterpreterTool
from agent_framework.azure import AzureAIAgentClient
from azure.identity.aio import DefaultAzureCredential
async with (
    DefaultAzureCredential() as credential,
    ChatAgent(
        chat_client=AzureAIAgentClient(async_credential=credential),
        instructions="You are a helpful assistant that can execute Python code.",
        tools=HostedCodeInterpreterTool()
    ) as agent
):
    response = await agent.run("Calculate the factorial of 100 using Python")
For code interpreter examples, see:
- Azure AI with code interpreter
- Azure OpenAI Assistants with code interpreter
- OpenAI Assistants with code interpreter
Custom agents
It is also possible to create fully custom agents that are not just wrappers around a chat client.
Agent Framework provides the AgentProtocol protocol and the BaseAgent base class; implementing the protocol or subclassing the base class gives complete control over the agent's behavior and capabilities.
from collections.abc import AsyncIterable
from typing import Any

from agent_framework import AgentRunResponse, AgentRunResponseUpdate, AgentThread, BaseAgent, ChatMessage


class CustomAgent(BaseAgent):
    async def run(
        self,
        messages: str | ChatMessage | list[str] | list[ChatMessage] | None = None,
        *,
        thread: AgentThread | None = None,
        **kwargs: Any,
    ) -> AgentRunResponse:
        # Custom agent implementation: build and return a complete AgentRunResponse.
        pass

    def run_stream(
        self,
        messages: str | ChatMessage | list[str] | list[ChatMessage] | None = None,
        *,
        thread: AgentThread | None = None,
        **kwargs: Any,
    ) -> AsyncIterable[AgentRunResponseUpdate]:
        # Custom streaming implementation: yield AgentRunResponseUpdate items as they are produced.
        pass
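The skeleton above leaves the relationship between `run` and `run_stream` implicit: a streaming method yields pieces, and the non-streaming method can be built by aggregating them. Here is a framework-free sketch of that duality; the class and its string-based messages are illustrative, not the Agent Framework API:

```python
import asyncio
from collections.abc import AsyncIterable


class EchoAgent:
    """Framework-free sketch of the run/run_stream duality."""

    async def run_stream(self, message: str) -> AsyncIterable[str]:
        # Yield the reply in pieces, as a streaming agent would.
        for word in f"You said: {message}".split(" "):
            yield word + " "

    async def run(self, message: str) -> str:
        # A non-streaming run can be built by collecting the stream.
        chunks = [chunk async for chunk in self.run_stream(message)]
        return "".join(chunks).strip()


result = asyncio.run(EchoAgent().run("hello"))
print(result)  # You said: hello
```

Implementing the streaming path first and deriving the complete response from it keeps the two methods consistent by construction.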