@@ -12,8 +12,8 @@ params:
 > [!TIP]
 >
 > This guide uses the familiar Docker Compose workflow to orchestrate agentic AI
-> applications. For a smoother development experience, check out [Docker
-> Docker Agent](../manuals/ai/docker-agent/_index.md), a purpose-built agent runtime that
+> applications. For a smoother development experience, check out
+> [Docker Agent](../manuals/ai/docker-agent/_index.md), a purpose-built agent runtime that
 > simplifies running and managing AI agents.

 ## Introduction
@@ -65,11 +65,11 @@ all works together.

 To follow this guide, you need to:

-- [Install Docker Desktop 4.43 or later](../get-started/get-docker.md)
-- [Enable Docker Model Runner](/manuals/ai/model-runner.md#enable-dmr-in-docker-desktop)
-- At least the following hardware specifications:
-  - VRAM: 3.5 GB
-  - Storage: 2.31 GB
+- [Install Docker Desktop 4.43 or later](../get-started/get-docker.md)
+- [Enable Docker Model Runner](/manuals/ai/model-runner.md#enable-dmr-in-docker-desktop)
+- At least the following hardware specifications:
+  - VRAM: 3.5 GB
+  - Storage: 2.31 GB

 ## Step 1: Clone the sample application

@@ -90,8 +90,9 @@ run in the cloud. This particular example uses the [Gemma 3 4B
 model](https://hub.docker.com/r/ai/gemma3) with a context size of `10000`.

 Hardware requirements:
-- VRAM: 3.5 GB
-- Storage: 2.31 GB
+
+- VRAM: 3.5 GB
+- Storage: 2.31 GB

 If your machine exceeds those requirements, consider running the application with a larger
 context size or a larger model to improve the agent's performance. You can easily
@@ -113,7 +114,7 @@ To run the application locally, follow these steps:
    incorrect fact in the prompt and hit enter. An agent searches DuckDuckGo to
    verify it and another agent revises the output.

-   ![Screenshot of the application](./images/agentic-ai-app.png)
+   ![Screenshot of the application](./images/agentic-ai-app.png)

 3. Press ctrl-c in the terminal to stop the application when you're done.

@@ -136,7 +137,7 @@ services:
     depends_on:
       - mcp-gateway
     models:
-      gemma3:
+      gemma3:
         endpoint_var: MODEL_RUNNER_URL
         model_var: MODEL_RUNNER_MODEL

@@ -160,16 +161,16 @@ models:

 The app consists of three main components:

-- The `adk` service, which is the web application that runs the agentic AI
-  application. This service talks to the MCP gateway and model.
-- The `mcp-gateway` service, which is the MCP gateway that connects the app
-  to external tools and services.
-- The `models` block, which defines the model to use with the application.
+- The `adk` service, which is the web application that runs the agentic AI
+  application. This service talks to the MCP gateway and model.
+- The `mcp-gateway` service, which is the MCP gateway that connects the app
+  to external tools and services.
+- The `models` block, which defines the model to use with the application.

 When you examine the `compose.yaml` file, you'll notice two notable elements for the model:

-- A service-level `models` block in the `adk` service
-- A top-level `models` block
+- A service-level `models` block in the `adk` service
+- A top-level `models` block

 These two blocks together let Docker Compose automatically start and connect
 your ADK web app to the specified LLM.
@@ -189,7 +190,7 @@ example, it uses the [`duckduckgo` MCP
 server](https://hub.docker.com/mcp/server/duckduckgo/overview) to perform web
 searches.

-> [!TIP]
+> [!TIP]
 >
 > Looking for more MCP servers to use? Check out the [Docker MCP
 > Catalog](https://hub.docker.com/catalogs/mcp/).
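Inside the `adk` container, the two injected variables (`MODEL_RUNNER_URL` and `MODEL_RUNNER_MODEL`) are what the app uses to reach the model. A minimal sketch of consuming them against Model Runner's OpenAI-compatible API; the `/chat/completions` suffix and the `build_chat_request` helper are assumptions for illustration, not code from the sample app:

```python
import json
import os
from urllib import request


def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for the injected endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        base_url.rstrip("/") + "/chat/completions",  # assumed API path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


if __name__ == "__main__":
    # Compose sets these via endpoint_var / model_var in compose.yaml.
    req = build_chat_request(
        os.environ["MODEL_RUNNER_URL"],
        os.environ["MODEL_RUNNER_MODEL"],
        "The Eiffel Tower is located in Berlin.",
    )
    with request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

The helper only builds the request, so the network call stays in one place and the URL-joining logic is easy to check in isolation.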