Series note — This is chapter 0 of MAF v1: Python and .NET. The original Python-only series at Building a Multi-Agent E-Commerce Platform assumed you had the Python stack ready. This series covers both languages, so you need a slightly bigger tool belt.
Repo — All runnable code for the series lives at github.com/nitin27may/e-commerce-agents. Clone it once and keep it open; every chapter folder under `tutorials/` is self-contained.
## Why this chapter
The rest of the series runs real code against real LLMs. You need one Python toolchain, one .NET toolchain, Docker for the supporting infra, and a key for either OpenAI or Azure OpenAI. Do this once and forget it.
If you have never heard of the Microsoft Agent Framework before, read the next section first — it will make the installation steps feel less arbitrary. If you’re already running MAF locally, skip straight to Step 1.
## What is the Microsoft Agent Framework?
The Microsoft Agent Framework (MAF) is Microsoft’s SDK for building LLM-powered agents and multi-agent workflows. It ships in two flavours that share the same abstractions:
- **Python** — the `agent-framework` package (plus `agent-framework-a2a`, `agent-framework-openai`, and friends).
- **.NET** — the `Microsoft.Agents.AI` family of NuGet packages (built on `Microsoft.Extensions.AI`).
Under the hood MAF wraps three moving parts. An agent is an LLM (called a chat client in MAF) plus instructions, tools, and optional middleware. A session is how MAF remembers a multi-turn conversation. A workflow is a graph of executors you use when an LLM-driven agent is too free-form and you want deterministic orchestration instead.
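As a mental model, the three primitives can be sketched in plain Python. This is **not** MAF's actual API — the class and field names below are illustrative only:

```python
from dataclasses import dataclass, field
from typing import Callable

# Illustrative sketch of MAF's three primitives -- NOT the real
# agent-framework API, just the shape of the concepts.

@dataclass
class Agent:
    """An LLM ("chat client") + instructions + tools + optional middleware."""
    chat_client: str                       # e.g. "openai:gpt-4.1" (hypothetical)
    instructions: str
    tools: list[Callable] = field(default_factory=list)

@dataclass
class Session:
    """How MAF remembers a multi-turn conversation."""
    messages: list[dict] = field(default_factory=list)

    def append(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

@dataclass
class Workflow:
    """A graph of executors for deterministic orchestration."""
    executors: dict[str, Callable] = field(default_factory=dict)
    edges: list[tuple[str, str]] = field(default_factory=list)
```

The real SDKs carry far more machinery (middleware pipelines, streaming, tool schemas), but every chapter in this series maps back to these three shapes.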
This tutorial series builds a full e-commerce platform on top of those three primitives. The stack you’re installing in this chapter is a superset of what any individual chapter needs — we’re over-provisioning so later chapters don’t ask you to install anything new.
```mermaid
flowchart LR
    subgraph toolchains["Language toolchains"]
        py["Python 3.12 + uv<br/>agent-framework"]
        net[".NET 9 SDK<br/>Microsoft.Agents.AI"]
    end
    subgraph infra_stack["Docker Compose (local infra)"]
        pg[("Postgres 16<br/>+ pgvector")]
        redis[("Redis 7")]
        aspire["Aspire Dashboard<br/>:18888"]
    end
    subgraph web["Frontend"]
        next["Next.js 16<br/>Node 20 + pnpm"]
    end
    subgraph cloud["LLM provider (pick one)"]
        openai(["OpenAI<br/>gpt-4.1"])
        azure(["Azure OpenAI<br/>gpt-4.1 deployment"])
    end
    py --> pg
    py --> redis
    py --> openai
    py --> azure
    net --> pg
    net --> redis
    net --> openai
    net --> azure
    next --> py
    next --> net
    py -. OTel .-> aspire
    net -. OTel .-> aspire
    class py,net core
    class openai,azure external
    class pg,redis,aspire infra
    class next success
```
You install the blue pieces (language toolchains) and the grey pieces (Docker + infra) locally. You bring your own orange piece (one LLM provider key). The green piece is Next.js, which talks to whichever backend you run.
## Prerequisites
You need a Unix-like shell. macOS and Linux work out of the box; on Windows, use WSL2. Everything below assumes bash or zsh.
If you plan to follow the official Microsoft tutorials alongside this series, keep the MAF docs tabs open; the links are collected under Further reading & links at the end of this chapter.
## Step 1 — Install the toolchains
### uv (Python package manager)
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Verify and install Python 3.12:

```bash
uv --version            # expect: uv 0.5.x or later
uv python install 3.12
```

**Why uv and not pip / poetry** — it resolves and installs 10–100× faster, manages the virtualenv for you, and is what every tutorial in this repo pins. Treat uv as the single entry point for Python deps.
### .NET 9 SDK
| Platform | Install |
|---|---|
| macOS | `brew install --cask dotnet-sdk` |
| Ubuntu | Microsoft’s official instructions |
| Windows | dotnet.microsoft.com/download |
Verify:
```bash
dotnet --list-sdks   # expect: 9.0.x or later
```

### Docker + Compose v2
Install Docker Desktop (macOS / Windows) or Docker Engine + the `docker-compose-plugin` on Linux. `docker compose version` must print a v2.x string.
### Node 20 + pnpm (for the frontend)
```bash
# If you don't have Node:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash
nvm install 20

# pnpm:
corepack enable pnpm
```

## Step 2 — Clone and configure
```bash
git clone https://github.com/nitin27may/e-commerce-agents.git
cd e-commerce-agents
cp .env.example .env
```

Open `.env` and pick one provider. You only need one key to follow the series; the code reads `LLM_PROVIDER` and switches automatically.
### Option A — OpenAI
```bash
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...   # from platform.openai.com/api-keys
LLM_MODEL=gpt-4.1
EMBEDDING_MODEL=text-embedding-3-small
```

Get a key at platform.openai.com/api-keys. The series assumes a model with tool-calling support — `gpt-4.1`, `gpt-4o`, or `gpt-4o-mini` all work.
### Option B — Azure OpenAI
```bash
LLM_PROVIDER=azure
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_KEY=...
AZURE_OPENAI_DEPLOYMENT=gpt-4.1
AZURE_OPENAI_API_VERSION=2025-03-01-preview
AZURE_EMBEDDING_DEPLOYMENT=text-embedding-3-small
```

On Azure OpenAI, `AZURE_OPENAI_DEPLOYMENT` is the deployment name you created in the portal — not the model family. If your deployment is called `gpt-4.1-prod`, that's what goes here.
### Environment variable reference
| Variable | Purpose | Required when | Default |
|---|---|---|---|
| `LLM_PROVIDER` | Selects OpenAI or Azure OpenAI code paths | Always | `openai` |
| `LLM_MODEL` | Model name (OpenAI only) | `LLM_PROVIDER=openai` | `gpt-4.1` |
| `OPENAI_API_KEY` | OpenAI secret key | `LLM_PROVIDER=openai` | — |
| `EMBEDDING_MODEL` | Embedding model (OpenAI only) | Embeddings-aware chapters | `text-embedding-3-small` |
| `AZURE_OPENAI_ENDPOINT` | Azure resource URL | `LLM_PROVIDER=azure` | — |
| `AZURE_OPENAI_KEY` | Azure resource key | `LLM_PROVIDER=azure` | — |
| `AZURE_OPENAI_DEPLOYMENT` | Chat deployment name | `LLM_PROVIDER=azure` | — |
| `AZURE_OPENAI_API_VERSION` | Azure REST API version | `LLM_PROVIDER=azure` | `2025-03-01-preview` |
| `AZURE_EMBEDDING_DEPLOYMENT` | Embedding deployment name | Embeddings-aware chapters | — |
| `JWT_SECRET` | Signs user JWTs (capstone auth) | Capstone only | dev default |
| `AGENT_SHARED_SECRET` | Inter-agent `x-agent-secret` header | Capstone only | dev default |
Keep JWT_SECRET and AGENT_SHARED_SECRET at their defaults for local dev. They’re rotated in production only.
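To make the provider switch concrete, here is a minimal sketch of the kind of `LLM_PROVIDER` dispatch the repo's settings code performs. The variable names match the table above, but the helper itself is hypothetical, not the repo's actual loader:

```python
import os

def load_llm_config() -> dict:
    """Hypothetical sketch: assemble OpenAI or Azure OpenAI config from env vars."""
    provider = os.environ.get("LLM_PROVIDER", "openai")
    if provider == "openai":
        return {
            "provider": "openai",
            "api_key": os.environ["OPENAI_API_KEY"],
            "model": os.environ.get("LLM_MODEL", "gpt-4.1"),
        }
    if provider == "azure":
        return {
            "provider": "azure",
            "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
            "api_key": os.environ["AZURE_OPENAI_KEY"],
            # Deployment name from the portal, NOT the model family.
            "deployment": os.environ["AZURE_OPENAI_DEPLOYMENT"],
            "api_version": os.environ.get(
                "AZURE_OPENAI_API_VERSION", "2025-03-01-preview"
            ),
        }
    raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}")
```

Note how the required vs optional columns of the table fall out of the code: required vars use bracket indexing (raising `KeyError` if missing), defaulted ones use `.get()`.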
## Step 3 — Verify
One script checks everything at once:
```bash
./scripts/verify-setup.sh
```

Expected output:
```
Tooling
✓ uv (Python package manager)
✓ Python 3.12+
✓ .NET SDK 9+
✓ Docker
✓ Docker Compose v2
...
Summary
All 15 checks passed.
```

If any check shows ✗, fix that item and re-run — the script is idempotent. See Troubleshooting below for common failures.
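If you're curious what one of those checks involves, here is a hypothetical Python sketch of the SDK-version test. `verify-setup.sh` is a shell script, so this only mirrors the logic: parse `dotnet --list-sdks` output and look for a major version at or above the minimum:

```python
def has_sdk_at_least(list_sdks_output: str, major: int) -> bool:
    """Check `dotnet --list-sdks` output for an SDK >= the given major version.

    Each line looks like: '9.0.101 [/usr/local/share/dotnet/sdk]'.
    """
    for line in list_sdks_output.splitlines():
        version = line.split(" ")[0]          # "9.0.101"
        try:
            if int(version.split(".")[0]) >= major:
                return True
        except ValueError:
            continue                           # skip blank or malformed lines
    return False
```

The same pattern — run a command, parse its output, pass or fail — is all an environment check ever is, which is why the script stays idempotent.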
## Step 4 — Run something
Bring up the Python stack:
```bash
./scripts/dev.sh
```

- Frontend: http://localhost:3000
- Orchestrator API: http://localhost:8080
- Aspire Dashboard (telemetry): http://localhost:18888
Log in with any seeded test user (see the repo root README.md) and try a prompt like “show me running shoes under $100”.
To try the .NET stack (as each chapter’s dotnet/ side lands):
```bash
docker compose -f docker-compose.dotnet.yml --profile agents up --build
```

Both compose files expose the backend on :8080, so you only run one backend at a time.
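Because both backends claim :8080, a port collision is the most common reason the second `compose up` fails. A quick way to check whether a port is already taken — a standalone sketch, not part of the repo's scripts:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0  # 0 means the connect succeeded

# Example: check the ports the compose files bind locally.
for p in (5432, 6379, 8080):
    print(p, "in use" if port_in_use(p) else "free")
```

Any port reported "in use" before you start Docker is being held by a host service (often a locally installed Postgres or Redis) — see Troubleshooting below.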
## Side-by-side differences
This chapter is pure tooling — no agent code yet. Two cross-cutting differences are worth flagging before Chapter 01:
| Concern | Python | .NET |
|---|---|---|
| Env-var loading | `pydantic-settings` reads `.env` automatically | ASP.NET Core reads env vars + `launchSettings.json`; `.env` needs a small loader (shipped in `ECommerceAgents.Shared`) |
| Package management | `uv sync` (per-project lockfile) | `dotnet restore` with central package management via `Directory.Packages.props` |
| Runtime | Python 3.12 — single interpreter per service | .NET 9 — one runtime per service, can run self-contained |
| LLM SDK | `openai` + `azure-openai`, wrapped by `agent-framework-openai` | `Azure.AI.OpenAI` + `Microsoft.Extensions.AI.OpenAI`, wrapped by `Microsoft.Agents.AI` |
## Troubleshooting
The most common verify-setup.sh failures and what to do about them:
| Symptom | Likely cause | Fix |
|---|---|---|
| `✗ uv (Python package manager)` | uv binary not on PATH | Restart your shell, or add `~/.local/bin` to PATH. |
| `✗ Python 3.12+` | uv installed but no 3.12 toolchain | `uv python install 3.12` then re-run. |
| `✗ .NET SDK 9+` | Only older SDKs installed | `dotnet --list-sdks` — if nothing ≥ 9.0, reinstall from the vendor link above. |
| `✗ Docker Compose v2` | Standalone `docker-compose` (v1) installed | Use `docker compose` (no hyphen). Install `docker-compose-plugin` on Linux. |
| `✗ OPENAI_API_KEY` | Placeholder still in `.env` | The script rejects `sk-your-openai-api-key-here`. Paste your real key. |
| Requests fail with 404 on Azure | Deployment name mismatch | `AZURE_OPENAI_DEPLOYMENT` must equal the portal deployment name exactly, not the model family. |
| `docker compose` freezes on macOS | Docker Desktop stalled | Quit Docker Desktop → open again, or reinstall if it won't start. |
| `bind: address already in use` on 5432 / 6379 / 8080 | Existing local Postgres / Redis / other service | Stop the host service, or change ports in `docker-compose.yml`. |
| `./scripts/dev.sh: Permission denied` | Scripts not executable after clone | `chmod +x scripts/*.sh` |
## Gotchas
- **Azure deployment name vs model family.** Portal shows `gpt-4.1-prod` → put `gpt-4.1-prod` in `.env`, not `gpt-4.1`. This is the #1 Azure pitfall.
- **`verify-setup.sh` rejects the placeholder.** `OPENAI_API_KEY=sk-your-openai-api-key-here` is a sentinel. Replace it with your real key.
- **WSL2 + Docker Desktop networking.** On Windows, the containers bind to WSL2's localhost; confirm `localhost:3000` reaches it.
- **Cost awareness.** Every request hits a paid LLM. `gpt-4.1` is cheap per-request, but tight loops (e.g. streaming tests) add up. Use `gpt-4o-mini` for iteration if you want to keep the meter low.
## Tests
This chapter’s “test” is the verify script. Run it in CI to catch toolchain regressions:
```bash
./scripts/verify-setup.sh
echo "exit code: $?"   # 0 if all checks pass
```

The script exits non-zero on the first failure, making it usable as a gate in GitHub Actions.
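As an example, a minimal GitHub Actions job that uses the script as a gate might look like the fragment below. The workflow name and runner are placeholders — adapt them to your pipeline, and note the runner must already have the checked tools (uv, .NET, Docker) installed for every check to pass:

```yaml
# .github/workflows/verify-setup.yml (illustrative, not shipped in the repo)
name: verify-setup
on: [push, pull_request]
jobs:
  verify:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run environment checks
        run: ./scripts/verify-setup.sh   # non-zero exit fails the job
```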
## How this shows up in the capstone
`verify-setup.sh` lives alongside `scripts/dev.sh` and is linked from the repo root `README.md`. It is the single source of truth for “my environment is ready” — both for newcomers on their first clone and for CI.
## Further reading & links
**This chapter**
- Source on GitHub: tutorials/00-setup
- Next: Chapter 01 — Your First Agent
**Microsoft Agent Framework docs**
- Overview
- Get Started
- Agents — Providers (OpenAI vs Azure OpenAI)
**Supporting tools**

**Series shared resources**
- Mermaid style guide — the palette all diagrams in this series use.
- Jargon glossary — one-line definitions for every MAF term.
## What’s next
Chapter 01 — Your First Agent takes the stack you just installed and builds the smallest possible MAF agent — about 40 lines per language — to prove everything is wired up.