
Commit 1bd8b7f

feat(models): add OCI Generative AI provider for Google Gemini on OCI
Adds first-class support for Google Gemini models hosted on the Oracle Cloud Infrastructure (OCI) Generative AI service, a native Google × OCI model partnership that makes Gemini available directly through OCI's inference endpoints.

Key design points:
- Subclasses BaseLlm following the anthropic_llm.py pattern
- Uses the OCI Python SDK directly (no LangChain dependency)
- Optional dependency: pip install google-adk[oci]
- Supports API_KEY, INSTANCE_PRINCIPAL, and RESOURCE_PRINCIPAL auth
- Both non-streaming (_call_oci) and streaming (_call_oci_stream) paths share setup code via _build_chat_details(); streaming collects OCI's OpenAI-compatible SSE events in a thread pool (asyncio.to_thread) and yields partial then final LlmResponse
- Registers google.gemini-* (and other OCI-hosted) model patterns in LLMRegistry via optional try/except in models/__init__.py
- 37 unit tests (fully mocked, no OCI account needed)
- 10 integration tests (skipped when OCI_COMPARTMENT_ID is unset)

Supported models: google.gemini-*, google.gemma-*, meta.llama-*, mistralai.*, xai.grok-*, nvidia.*
1 parent c87ee1e commit 1bd8b7f

5 files changed: 1,600 additions, 0 deletions


pyproject.toml

Lines changed: 1 addition & 0 deletions
@@ -130,6 +130,7 @@ optional-dependencies.extensions = [
   "pypika>=0.50", # For crewai->chromadb dependency
   "toolbox-adk>=1,<2", # For tools.toolbox_toolset.ToolboxToolset
 ]
+optional-dependencies.oci = [ "oci>=2.170.0" ] # For OCI Generative AI model support
 optional-dependencies.otel-gcp = [ "opentelemetry-instrumentation-google-genai>=0.6b0,<1" ]
 optional-dependencies.slack = [ "slack-bolt>=1.22" ]
 optional-dependencies.test = [

src/google/adk/models/__init__.py

Lines changed: 14 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -31,6 +31,7 @@
 from .gemma_llm import Gemma3Ollama
 from .google_llm import Gemini
 from .lite_llm import LiteLlm
+from .oci_genai_llm import OCIGenAILlm

 __all__ = [
     'ApigeeLlm',
@@ -41,6 +42,7 @@
     'Gemma3Ollama',
     'LLMRegistry',
     'LiteLlm',
+    'OCIGenAILlm',
 ]

 _LAZY_PROVIDERS: dict[str, tuple[list[str], str]] = {
@@ -78,6 +80,18 @@
         ],
         'lite_llm',
     ),
+    'OCIGenAILlm': (
+        [
+            r'meta\.llama-.*',
+            r'google\.gemini-.*',
+            r'google\.gemma-.*',
+            r'xai\.grok-.*',
+            r'mistralai\.mistral-.*',
+            r'mistralai\.mixtral-.*',
+            r'nvidia\..*',
+        ],
+        'oci_genai_llm',
+    ),
 }

 for _name, (_patterns, _module) in _LAZY_PROVIDERS.items():
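As a rough illustration of what the registered patterns accomplish (this is not the actual LLMRegistry resolution code, just a standalone sketch using the same regexes), a model id is routed to the OCI provider when it fully matches one of the patterns from the diff above:

```python
import re

# The pattern list registered for OCIGenAILlm in the diff above.
_OCI_PATTERNS = [
    r'meta\.llama-.*',
    r'google\.gemini-.*',
    r'google\.gemma-.*',
    r'xai\.grok-.*',
    r'mistralai\.mistral-.*',
    r'mistralai\.mixtral-.*',
    r'nvidia\..*',
]


def resolves_to_oci(model: str) -> bool:
  """Return True when the model id matches an OCI provider pattern."""
  return any(re.fullmatch(p, model) for p in _OCI_PATTERNS)
```

Note the vendor prefix is what disambiguates: 'google.gemini-2.0-flash' goes to the OCI provider, while a bare 'gemini-2.0-flash' still falls through to the native Gemini registration.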
