# Azure AI Agent Server Adapter for Python
## Getting started
```bash
pip install azure-ai-agentserver-core
```
## Key concepts
This is the core package for the Azure AI Agent Server. It hosts your agent as a container in the cloud, and you can talk to the hosted agent using the azure-ai-projects SDK.
## Examples
If your agent is not built with a supported framework such as LangGraph or Agent Framework, you can still make it compatible with Microsoft AI Foundry by implementing the predefined interface yourself.
```python
import datetime

from azure.ai.agentserver.core import FoundryCBAgent
from azure.ai.agentserver.core.models import (
    CreateResponse,
    Response as OpenAIResponse,
)
from azure.ai.agentserver.core.models.projects import (
    ItemContentOutputText,
    ResponsesAssistantMessageItemResource,
    ResponseTextDeltaEvent,
    ResponseTextDoneEvent,
)


def stream_events(text: str):
    """Yield one delta event per token, then a final done event."""
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield ResponseTextDeltaEvent(delta=piece)
    # Done with text: emit the fully assembled string
    yield ResponseTextDoneEvent(text=assembled)


async def agent_run(request_body: CreateResponse):
    agent = request_body.agent
    print(f"agent: {agent}")
    if request_body.stream:
        return stream_events("I am mock agent with no intelligence in stream mode.")
    # Build the assistant's output content
    output_content = [
        ItemContentOutputText(
            text="I am mock agent with no intelligence.",
            annotations=[],
        )
    ]
    response = OpenAIResponse(
        metadata={},
        temperature=0.0,
        top_p=0.0,
        user="me",
        id="id",
        created_at=datetime.datetime.now(),
        output=[
            ResponsesAssistantMessageItemResource(
                status="completed",
                content=output_content,
            )
        ],
    )
    return response


my_agent = FoundryCBAgent()
my_agent.agent_run = agent_run

if __name__ == "__main__":
    my_agent.run()
```
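Once the agent is running locally, you can exercise it by posting an OpenAI-style responses request to it over HTTP. The sketch below shows what such a request might look like; the port, the `/responses` path, and the exact payload fields are illustrative assumptions, not documented guarantees.

```python
import json
import urllib.request

# Assumption: the adapter serves an OpenAI-responses-style endpoint on
# localhost; the port and path below are hypothetical placeholders.
BASE_URL = "http://localhost:8088"


def ask_agent(text: str) -> bytes:
    """POST a minimal responses-style request body to the local agent."""
    payload = json.dumps({"input": text, "stream": False}).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/responses",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# Build the request body without sending it, just to show its shape.
body = json.dumps({"input": "Hello, agent!", "stream": False})
print(body)
```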
## Troubleshooting
First, run your agent locally with azure-ai-agentserver-core. If it works locally but fails in the cloud, check the logs in the Application Insights resource connected to your Azure AI Foundry project.
### Reporting issues
To report an issue with the client library, or request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues). Mention the package name "azure-ai-agentserver-core" in the title or content.
## Next steps
Please visit the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/agentserver/azure-ai-agentserver-core/samples) folder, which contains several examples of building your agent with azure-ai-agentserver.
## Contributing
This project welcomes contributions and suggestions. Most contributions require
you to agree to a Contributor License Agreement (CLA) declaring that you have
the right to, and actually do, grant us the rights to use your contribution.
For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether
you need to provide a CLA and decorate the PR appropriately (e.g., label,
comment). Simply follow the instructions provided by the bot. You will only
need to do this once across all repos using our CLA.
This project has adopted the
[Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information,
see the Code of Conduct FAQ or contact opencode@microsoft.com with any
additional questions or comments.