Demo 1 — Getting Started (Create and Run Your First Foundry Agent)
Reference:
- https://learn.microsoft.com/en-us/agent-framework/tutorials/agents/run-agent?pivots=programming-language-python
- https://learn.microsoft.com/en-us/agent-framework/user-guide/agents/agent-types/azure-ai-foundry-agent?pivots=programming-language-python
Objectives (What You Will Learn in This Demo)
- Create and run a single agent using Agent Framework (Python)
- Use Microsoft Foundry Agents (`FoundryChatClient`) as the backend and retrieve the result of `agent.run()`
- Experience the minimal unit of an agent: “Agent = LLM + Instructions + Execution API”
Note:
- Demos 2/3/5 that follow build on the same backend (Foundry Agents)
- Direct Azure OpenAI connection (`OpenAIChatCompletionClient`) is a separate backend (covered in Demos 4/6 in this repository)
Prerequisites (Once These Are in Place, the Rest Is Just Copy-and-Paste)
A. Execution Environment
- Dev Containers / GitHub Codespaces (recommended)
- For Codespaces: just open the repository and click “Create codespace”
- For local Dev Containers: use “Reopen in Container” in VS Code
Even without Dev Containers, you can run the demo as long as you have Python 3.10+ and `pip`.
B. Setting Up Microsoft Foundry Agents (One-Time Setup)
Demo 1 uses a Microsoft Foundry Project as its backend.
- Prepare a Hub / Project in Microsoft Foundry (an existing one is fine)
- Deploy a model under the Project’s Models + endpoints (e.g., `gpt-4o-mini`)
- Grant RBAC permissions to the executing user (the account used for `az login`) so they can run Agents on the Project / Hub
The two main things you need for this demo are:
- Project endpoint (`FOUNDRY_PROJECT_ENDPOINT`)
- Model deployment name (`FOUNDRY_MODEL`)
C. Required Environment Variables (Set Using One of the Following Methods)
- Method 1: Codespaces Secrets (recommended — prevents key leakage)
- Method 2: Use `export` inside the Dev Container
- Method 3: `.env` (do NOT commit this file; provide a `.env.example` instead)
Minimum required: `FOUNDRY_PROJECT_ENDPOINT`, `FOUNDRY_MODEL`
Example:

```bash
export FOUNDRY_PROJECT_ENDPOINT="https://<account>.services.ai.azure.com/api/projects/<project-id>"
export FOUNDRY_MODEL="gpt-4o-mini"
```
Common Pitfalls:
- `FOUNDRY_PROJECT_ENDPOINT` must be the Foundry Project endpoint (`https://...services.ai.azure.com/api/projects/...`).
  - This is different from the Azure OpenAI or Azure AI Services endpoint (`...cognitiveservices.azure.com`).
- `FOUNDRY_MODEL` is the deployment name on the Foundry project side (not the model name).
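The distinction between the two endpoint shapes can be encoded in a small sanity check. This is an illustrative sketch, not part of the demo script, and the function name is hypothetical:

```python
import re

def looks_like_foundry_project_endpoint(url: str) -> bool:
    """Heuristic check: a Foundry Project endpoint has the form
    https://<account>.services.ai.azure.com/api/projects/<project-id>."""
    pattern = r"^https://[^/]+\.services\.ai\.azure\.com/api/projects/[^/]+/?$"
    return re.match(pattern, url) is not None

# A Foundry Project endpoint passes; an Azure OpenAI / AI Services endpoint does not.
print(looks_like_foundry_project_endpoint(
    "https://myaccount.services.ai.azure.com/api/projects/my-project"))  # True
print(looks_like_foundry_project_endpoint(
    "https://myresource.cognitiveservices.azure.com/"))                  # False
```

Running a check like this before starting the demo can save a round-trip through a confusing 404.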
Steps (Step-by-step)
Step 1. Install Dependencies
If you are using a Dev Container, the dependencies are most likely already installed. If not, run the following:
```bash
pip install agent-framework-foundry --pre
```
Step 2. Log In with Azure CLI (Entra ID Authentication)
Run this inside Codespaces / the container:

```bash
az login
```

In Codespaces you may not be able to open a browser, so `--use-device-code` is convenient:

```bash
az login --use-device-code
```
Authentication Method Used in This Demo (Important)
This demo uses Microsoft Entra ID (= Azure CLI credential).
Therefore, az login is required.
Step 3. Review the Script (src/demo1_run_agent.py)
This repository includes src/demo1_run_agent.py.
(You do not need to create it manually.)
(Note)
- The official documentation may show examples using `create_agent(...)`, but this repository uses `as_agent(...)` to match the pinned version (`agent-framework-foundry>=1.2.2,<2.0`).
Workarounds for Common Pitfalls Included in This Repository
- In Dev Container / Codespaces environments, environment variables may be injected as empty strings. In that case, a typical dotenv load will fail to populate the values properly.
- To address this, `src/demo1_run_agent.py` explicitly loads the `.env` file from the repository root and only fills in environment variables that are unset or empty.
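The fill-only-if-empty idea can be sketched roughly like this (a simplified illustration of the approach; the actual code in `src/demo1_run_agent.py` may differ):

```python
import os
from pathlib import Path

def fill_missing_env(dotenv_path: Path) -> None:
    """Read KEY=VALUE lines from a .env file and set each variable
    only if it is currently unset or an empty string."""
    if not dotenv_path.exists():
        return
    for line in dotenv_path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip().strip('"')
        # An empty string injected by the container counts as "missing".
        if not os.environ.get(key):
            os.environ[key] = value
```

The key point is the `not os.environ.get(key)` check: a plain dotenv load would treat an injected empty string as "already set" and leave it empty.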
Step 4. Run
```bash
python3 -u src/demo1_run_agent.py
```
Expected behavior:
- `venue_specialist` proposes an event plan (venue, candidates, key points, etc.)
- `result.text` is displayed without errors
Technical Details (What Happens Behind the Scenes in This Demo)
1) Architecture Overview (Foundry / Local Code)
At a high level, this demo consists of the following three layers:
- Microsoft Foundry side
- Project (Hub/Project)
- Model deployment under Models + endpoints
- Environment variables (`.env` / Secrets / `export`)
  - `FOUNDRY_PROJECT_ENDPOINT`
  - `FOUNDRY_MODEL`
  - Authentication credentials (Entra ID: `az login`)
- Application code (`src/demo1_run_agent.py`)
  - Creates a `FoundryChatClient`, converts it to an Agent with `as_agent()`, and calls `run()`
2) FoundryChatClient Is a Client That Connects to Foundry Agents
`FoundryChatClient` is the client used to connect to and execute Agents on a Microsoft Foundry Project.
In this demo, the Project and Model are specified via environment variables, and agent.run() is called at runtime.
3) Observability (OpenTelemetry / OTel)
If OpenTelemetry is installed in the environment, the demo script displays Agent execution and Tool invocations as short one-line log entries (for observability during the demo).
4) run() Is a Single-Turn Execution
`run()` performs a single inference for a single input (user utterance) and returns the result as a `ChatResponse`.
`result.text` is the final result in human-readable text form.
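The single-turn vs. streaming calling patterns can be illustrated with a plain-asyncio stub (a stand-in class for illustration only, not the real Agent Framework client):

```python
import asyncio

class StubAgent:
    """Stand-in for an agent: run() returns the whole answer at once,
    run_stream() yields it in chunks as it is produced."""

    async def run(self, prompt: str) -> str:
        # Single-turn: one call, one complete result.
        return f"Answer to: {prompt}"

    async def run_stream(self, prompt: str):
        # Streaming: partial updates the caller can print incrementally.
        for chunk in ("Answer ", "to: ", prompt):
            yield chunk

async def main() -> None:
    agent = StubAgent()
    print(await agent.run("hello"))           # whole answer at once
    async for update in agent.run_stream("hello"):
        print(update, end="", flush=True)     # chunk by chunk
    print()

asyncio.run(main())
```

Both calls produce the same final text; the difference is only in how it is delivered to the caller.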
Reference: Streaming Version (run_stream())
As described in the Learn tutorials, you can use `run_stream()` if you want streaming output:

```python
async def main():
    # `agent` is the agent created earlier in the demo script
    async for update in agent.run_stream("Tell me a joke about a pirate."):
        if update.text:
            print(update.text, end="", flush=True)
    print()
```
Troubleshooting
Common Issue 1: Authentication Error (401/403)
- Verify that `az login` succeeded inside the container
- Verify that the appropriate role has been assigned on the target Foundry Project / Hub (see Prerequisites B)
- If a different subscription is active, switch with `az account set`
Common Issue 2: FOUNDRY_PROJECT_ENDPOINT Is Incorrect / 404
`FOUNDRY_PROJECT_ENDPOINT` must be the Foundry Project endpoint.
- ✅ Example: `https://<account>.services.ai.azure.com/api/projects/<project-id>`
- ❌ Example: `https://<resource>.cognitiveservices.azure.com/` (Azure OpenAI / Azure AI Services endpoint)
Common Issue 3: Failed to resolve model info (Wrong Deployment Name)
`FOUNDRY_MODEL` cannot be resolved in the Foundry project.
Check:
- Foundry portal → target project → verify the deployment name exists under Models + endpoints
Common Issue 4: DNS Resolution Fails and the Process Stalls Before Starting
The host in `FOUNDRY_PROJECT_ENDPOINT` cannot be resolved via DNS from this execution environment.
- Resolution: Review your private networking / private DNS configuration, or run from a network where DNS resolution is available
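To check DNS resolution directly from the execution environment, a generic stdlib probe works (an illustrative helper, independent of the demo script):

```python
import socket
from urllib.parse import urlparse

def can_resolve(endpoint_url: str) -> bool:
    """Return True if the endpoint's hostname resolves via DNS."""
    host = urlparse(endpoint_url).hostname
    if not host:
        return False
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```

If this returns `False` for your `FOUNDRY_PROJECT_ENDPOINT`, the problem is networking (private endpoints, private DNS zones), not the Agent Framework code.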
Next Demo
In Demo 2, we will add a tool (Hosted Web Search) to the agent.
→ Open demo2.md to continue.