ChatPrompt Class
Core class for interacting with AI models through a prompt-based interface.
Handles message processing, function calling, and plugin execution. Provides a flexible framework for building AI-powered applications.
Constructor
Initialize ChatPrompt with a model and optional functions/plugins.
ChatPrompt(model: AIModel, *, functions: list[microsoft.teams.ai.function.Function[Any]] | None = None, plugins: list[microsoft.teams.ai.plugin.AIPluginProtocol] | None = None)
Parameters
| Name | Description |
|---|---|
| model (Required) | AI model implementation for text generation |
| functions | Optional list of functions the model can call |
| plugins | Optional list of plugins for extending functionality |
Keyword-Only Parameters
| Name | Default value |
|---|---|
| functions | None |
| plugins | None |
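A minimal construction sketch is shown below. The top-level import path and the `some_model` instance are assumptions; substitute whichever concrete AIModel implementation you actually use.

```python
# A minimal sketch, assuming ChatPrompt is importable from the package root
# and `some_model` is an instance of a concrete AIModel implementation.
from microsoft.teams.ai import ChatPrompt  # import path assumed

prompt = ChatPrompt(
    some_model,        # any AIModel implementation (placeholder)
    functions=None,    # keyword-only; defaults to None
    plugins=None,      # keyword-only; defaults to None
)
```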
Methods
| Name | Description |
|---|---|
| send | Send a message to the AI model and get a response. |
| with_function | Add a function to the available functions for this prompt. Supports three calling styles (see below). |
| with_plugin | Add a plugin to the chat prompt. |
send
Send a message to the AI model and get a response.
async send(input: str | UserMessage | ModelMessage | SystemMessage | FunctionMessage, *, memory: Memory | None = None, on_chunk: Callable[[str], Awaitable[None]] | Callable[[str], None] | None = None, instructions: str | SystemMessage | None = None) -> ChatSendResult
Parameters
| Name | Description |
|---|---|
| input (Required) | Message to send (a string is converted to a UserMessage) |
| memory | Optional memory for conversation context |
| on_chunk | Optional callback for streaming response chunks |
| instructions | Optional system message to guide model behavior |
Keyword-Only Parameters
| Name | Default value |
|---|---|
| memory | None |
| on_chunk | None |
| instructions | None |
Returns
| Type | Description |
|---|---|
| ChatSendResult | Result containing the final model response |
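A usage sketch for send follows, assuming `prompt` is a ChatPrompt instance like the one constructed earlier. It passes a plain string (converted to a UserMessage internally), a streaming callback, and system instructions; since the shape of ChatSendResult is not documented here, the example simply prints the result object.

```python
import asyncio

async def main() -> None:
    async def on_chunk(chunk: str) -> None:
        # Called with each streamed chunk of the response.
        print(chunk, end="", flush=True)

    result = await prompt.send(
        "Summarize today's stand-up notes.",
        on_chunk=on_chunk,                       # optional streaming callback
        instructions="Reply in two sentences.",  # optional system guidance
    )
    # Inspect the ChatSendResult; its attributes are not shown in this reference.
    print(result)

asyncio.run(main())
```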
with_function
Add a function to the available functions for this prompt.
Can be called in three ways (see the example after the Returns table):
- with_function(function=Function(...))
- with_function(name=..., description=..., parameter_schema=..., handler=...)
- with_function(name=..., description=..., handler=...), for functions that take no parameters
with_function(function: Function[T]) -> Self
Parameters
| Name | Description |
|---|---|
| function | Function object to add (first overload) |
| name | Function name (second and third overloads) |
| description | Function description (second and third overloads) |
| parameter_schema | Function parameter schema (second overload, optional) |
| handler | Function handler (second and third overloads) |
Keyword-Only Parameters
| Name | Default value |
|---|---|
| name | None |
| description | None |
| parameter_schema | None |
| handler | None |
Returns
| Type | Description |
|---|---|
| Self | The prompt itself, for method chaining |
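The sketch below illustrates the three calling styles listed above. The Function constructor fields, the async handler signatures, and the JSON-schema dict passed to parameter_schema are assumptions for illustration; match them to your actual Function and handler definitions.

```python
from microsoft.teams.ai.function import Function

async def get_weather(city: str) -> str:
    return f"It is sunny in {city}."

async def get_time() -> str:
    return "12:00"

# 1. Pass a prebuilt Function object (field names are assumptions).
prompt = prompt.with_function(
    Function(name="get_weather", description="Get the weather for a city", handler=get_weather)
)

# 2. Provide name, description, parameter_schema, and handler as keywords.
prompt = prompt.with_function(
    name="get_weather",
    description="Get the weather for a city",
    parameter_schema={"type": "object", "properties": {"city": {"type": "string"}}},
    handler=get_weather,
)

# 3. Omit parameter_schema for a function that takes no parameters.
prompt = prompt.with_function(name="get_time", description="Get the current time", handler=get_time)
```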
with_plugin
Add a plugin to the chat prompt.
with_plugin(plugin: AIPluginProtocol) -> Self
Parameters
| Name | Description |
|---|---|
| plugin (Required) | Plugin to add for extending functionality |
Returns
| Type | Description |
|---|---|
| Self | The prompt itself, for method chaining |
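Finally, a brief sketch of plugin registration, assuming `my_plugin` and `metrics_plugin` are hypothetical objects that satisfy AIPluginProtocol.

```python
# Because with_plugin returns Self, registrations can be chained fluently.
prompt = prompt.with_plugin(my_plugin).with_plugin(metrics_plugin)
```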