This sample shows how to use an OpenRouter-hosted model with ADK through the existing LiteLLM model connector.
Install ADK with the optional integrations so that `LiteLlm` is available:

```bash
pip install "google-adk[extensions]"
```

Set your OpenRouter API key:

```bash
export OPENROUTER_API_KEY="..."
```

Optionally choose a model:

```bash
export OPENROUTER_MODEL="openrouter/openai/gpt-5.2"
```

Run the sample:

```bash
adk run contributing/samples/hello_world_openrouter
```

OpenRouter is used here through LiteLLM's OpenAI-compatible routing path:
```python
LiteLlm(
    model="openrouter/openai/gpt-5.2",
    api_key=os.getenv("OPENROUTER_API_KEY"),
    api_base="https://openrouter.ai/api/v1",
)
```

For Gemini models routed through OpenRouter, use OpenRouter model IDs such as
`openrouter/google/gemini-2.5-pro:online`. ADK's built-in Google tools are
optimized for native Gemini model connections, so verify tool compatibility for
the routed model and provider you select.
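As a minimal sketch of the wiring above (standard library only; the variable and dict names here are illustrative, not part of the sample), the model ID and credentials can be resolved from the environment before being passed to `LiteLlm`:

```python
import os

# Resolve the model ID, honoring the optional OPENROUTER_MODEL override
# and falling back to the default used in this sample.
model_id = os.getenv("OPENROUTER_MODEL", "openrouter/openai/gpt-5.2")

# Keyword arguments mirroring the LiteLlm(...) call shown above;
# api_base points LiteLLM at OpenRouter's OpenAI-compatible endpoint.
litellm_kwargs = {
    "model": model_id,
    "api_key": os.getenv("OPENROUTER_API_KEY"),
    "api_base": "https://openrouter.ai/api/v1",
}
```

Passing these as `LiteLlm(**litellm_kwargs)` lets the `OPENROUTER_MODEL` export switch models without code changes.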