Commit 6a3d176: Initial commit of Getting Started Guide (5 files changed, 487 additions)
Copyright (c) 2026 Oracle and/or its affiliates.

The Universal Permissive License (UPL), Version 1.0

Subject to the condition set forth below, permission is hereby granted to any
person obtaining a copy of this software, associated documentation and/or data
(collectively the "Software"), free of charge and under any and all copyright
rights in the Software, and any and all patent rights owned or freely
licensable by each licensor hereunder covering either (i) the unmodified
Software as contributed to or provided by such licensor, or (ii) the Larger
Works (as defined below), to deal in both

(a) the Software, and

(b) any piece of software and/or hardware listed in the lrgrwrks.txt file if
one is included with the Software (each a "Larger Work" to which the Software
is contributed by such licensors),

without restriction, including without limitation the rights to copy, create
derivative works of, display, perform, and distribute the Software and make,
use, sell, offer for sale, import, export, have made, and have sold the
Software and the Larger Work(s), and to sublicense the foregoing rights on
either these or other terms.

This license is subject to the following condition:

The above copyright notice and either this complete permission notice or at a
minimum a reference to the UPL must be included in all copies or substantial
portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
# Getting Started with OCI Generative AI Service — Developer Onboarding Guide

*A collection of step-by-step guides for developers onboarding to Oracle Cloud Infrastructure, covering how to connect applications to the OCI Generative AI Service using the OCI SDK, LangChain, and low-code platforms like n8n.*

Author: Dejan Vlasakov

Reviewed: 12.02.2026

# When to use this asset?

*Use this asset when onboarding developers to OCI Generative AI Service. It serves as both a walkthrough for the onboarding team and a reference handout for developers.*

### Who
- Developer onboarding teams introducing OCI Generative AI to new users
- Developers connecting applications to OCI Generative AI for the first time
- Solution architects evaluating integration patterns (SDK, LangChain, low-code)

### When
- Setting up a new project that calls OCI Generative AI (chat, embeddings, or RAG)
- Integrating LangChain with OCI-hosted models for AI application development
- Connecting low-code / no-code platforms (n8n, etc.) to OCI Generative AI via an OpenAI-compatible gateway
- Onboarding workshops and enablement sessions

# How to use this asset?

*Pick the guide that matches your integration approach. Each guide includes prerequisites, working code examples, and environment configuration.*

### Guide 1 — Oracle Database 26ai + OCI Generative AI (OCI SDK)

End-to-end RAG pattern: create an OCI GenAI inference client, generate embeddings, perform vector similarity search in Oracle Database 26ai, and call the chat API with grounded context — all using the OCI Python SDK directly.

**[Read the guide →](files/OracleDB-GenAIOCI-Connection.md)**
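The vector search step in Guide 1 boils down to ranking stored embedding vectors by cosine similarity against a query embedding. Here is a minimal, illustrative sketch in plain Python (toy 3-dimensional vectors and hypothetical document names; the guide itself uses real OCI embeddings and Oracle Database 26ai vector search):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy query and document embeddings (real embedding models produce
# vectors with hundreds or thousands of dimensions)
query = [0.1, 0.9, 0.2]
docs = {"doc-a": [0.1, 0.8, 0.3], "doc-b": [0.9, 0.1, 0.0]}

# Rank documents by similarity to the query; the top hit becomes
# the grounded context passed to the chat API
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # doc-a
```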
### Guide 2 — LangChain + OCI Generative AI (`langchain-oci`)

Set up the official `langchain-oci` package, instantiate chat and embedding models, and build prompt chains — using the dedicated LangChain integration for OCI.

**[Read the guide →](files/LangChainOCI-GenAIOCI-Connection.md)**

### Guide 3 — n8n + OCI Generative AI (OpenAI-compatible gateway)

Deploy an OpenAI-compatible gateway that proxies requests to OCI GenAI, configure n8n to use it, and build a sample AI-powered summarization workflow — no custom code required.

**[Read the guide →](files/N8N-GenAIOCI-Connection.md)**

### Common Prerequisites

All three guides share the same foundational setup:

1. An active OCI tenancy with Generative AI enabled
2. IAM policies granting access to `generative-ai-family`
3. OCI API key authentication configured locally (`~/.oci/config`)

Each guide then adds tool-specific requirements (Python packages, LangChain, n8n, etc.).
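For reference, a minimal `~/.oci/config` for API key authentication looks like the following (every value here is an illustrative placeholder; see the API key authentication setup link under Useful Links for the exact steps):

```ini
[DEFAULT]
user=ocid1.user.oc1..<your_user_ocid>
fingerprint=<your_api_key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<your_tenancy_ocid>
region=us-chicago-1
```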
### File Structure
```
.
├── README.md                             # This file
├── LICENSE
└── files/
    ├── OracleDB-GenAIOCI-Connection.md     # Guide 1 – OCI SDK + Oracle DB 26ai
    ├── LangChainOCI-GenAIOCI-Connection.md # Guide 2 – LangChain integration
    └── N8N-GenAIOCI-Connection.md          # Guide 3 – n8n / low-code integration
```

# Useful Links

- [OCI Generative AI Documentation](https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm)
- [OCI Generative AI — LangChain Integration](https://docs.oracle.com/en-us/iaas/Content/generative-ai/langchain.htm)
- [OCI IAM Policies — Getting Started](https://docs.oracle.com/en-us/iaas/Content/Identity/Concepts/policygetstarted.htm)
- [OCI API Key Authentication Setup](https://docs.oracle.com/en-us/iaas/Content/generative-ai/setup-oci-api-auth.htm)

# License

Copyright (c) 2026 Oracle and/or its affiliates.

Licensed under the Universal Permissive License (UPL), Version 1.0.

See [LICENSE](LICENSE) for more details.
## Integrating LangChain with OCI Generative AI

`langchain-oci` is the official, dedicated LangChain integration package for Oracle Cloud Infrastructure services.

Install it with:

`pip install -U langchain-oci`

**Note:** The OCI integrations that were available in the `langchain-community` package are now deprecated. All future development, bug fixes, and feature enhancements are hosted in the new dedicated repository. We recommend migrating to `langchain-oci` so that you receive:

* The latest features and improvements
* Security updates and bug fixes
* Dedicated support and documentation
* Performance optimizations

To learn about OCI Generative AI's integration with LangChain, see the [Generative AI documentation](https://docs.oracle.com/iaas/Content/generative-ai/langchain.htm).
### Prerequisites
1. **OCI Account and Permissions:**
   * Create or use an active OCI tenancy with Generative AI enabled.
   * Set up IAM policies: grant your user/group access to `generative-ai-family` in a sandbox compartment (e.g., `allow group <group-name> to manage generative-ai-family in compartment <compartment-name>`).
   * Gather your compartment OCID (OCI Console > Identity & Security > Compartments).
   * Docs: [OCI Generative AI Getting Started](https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm) and [IAM Policies](https://docs.oracle.com/en-us/iaas/Content/Identity/Concepts/policygetstarted.htm).
2. **OCI CLI and SDK Installation:**
   * Install the OCI CLI (includes the Python SDK for authentication): `pip install oci-cli`.
   * Verify: `oci --version`.
   * Docs: [Install OCI CLI](https://docs.oracle.com/en-us/iaas/Content/API/SDKDocs/cliinstall.htm).
3. **Python Packages:**
   * Install the core libraries:
   ```bash
   pip install langchain langchain-oci oci
   ```
   * `langchain`: core framework.
   * `langchain-oci`: OCI chat and embedding model support.
   * `oci`: SDK for auth and API calls.
4. **[Set up OCI API key](https://docs.oracle.com/en-us/iaas/Content/generative-ai/setup-oci-api-auth.htm) authentication locally.**

## Connecting LangChain to OCI Generative AI
- Using a chat model: the `ChatOCIGenAI` class exposes chat models from OCI Generative AI
- Using a completion model: the `OCIGenAI` class exposes LLMs from OCI Generative AI
- Using an embeddings model: the `OCIGenAIEmbeddings` class exposes embedding models from OCI Generative AI
### 1. Basic LLM Instantiation and Invocation
Start with a simple call to an on-demand model using API key auth (the default). This invokes the LLM directly.
```python
from langchain_oci import OCIGenAI

# Initialize the OCI GenAI LLM interface
llm = OCIGenAI(
    model_id="cohere.command",  # On-demand model ID
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",  # Your compartment OCID
    # Auth uses ~/.oci/config by default (API key)
)

# Basic invocation with optional parameters
response = llm.invoke("Tell me one fact about Earth", temperature=0.7)
print(response)
# Expected output: a generated fact, e.g., "Earth is the third planet from the Sun."
```

### 2. Prompt Chaining with LLMChain
Chain a prompt template to the LLM for structured inputs and outputs. This example uses session token auth and passes model kwargs for consistent generation settings.
```python
from langchain_oci import OCIGenAI
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate

# Initialize with session token auth and model parameters
llm = OCIGenAI(
    model_id="meta.llama-3.3-70b-instruct",  # Another on-demand model
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # Profile from ~/.oci/config
    model_kwargs={
        "temperature": 0.7,
        "top_p": 0.75,     # Nucleus sampling
        "max_tokens": 200  # Limit response length
    }
)

# Define a prompt template
prompt = PromptTemplate(
    input_variables=["query"],
    template="{query}"  # Simple template; customize for multi-step chaining
)

# Create the chain
llm_chain = LLMChain(llm=llm, prompt=prompt)

# Invoke the chain with a dict matching the template's input variables
response = llm_chain.invoke({"query": "What is the capital of France?"})
print(response["text"])
# Expected output: "The capital of France is Paris."
```
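The `top_p` value in `model_kwargs` controls nucleus sampling. As a rough illustration of the idea (a simplified sketch, not OCI's actual sampler), the model restricts sampling to the smallest set of highest-probability candidate tokens whose cumulative probability reaches `top_p`:

```python
def top_p_candidates(token_probs, top_p=0.75):
    """Keep the smallest set of highest-probability tokens whose
    cumulative probability reaches top_p (nucleus sampling)."""
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = [], 0.0
    for token, prob in ranked:
        kept.append(token)
        cumulative += prob
        if cumulative >= top_p:
            break
    return kept

# With top_p=0.75, low-probability tail tokens are excluded from sampling
print(top_p_candidates({"Paris": 0.6, "Lyon": 0.2, "Nice": 0.15, "Oslo": 0.05}))
# ['Paris', 'Lyon']
```

Lower `top_p` narrows the candidate set and makes output more deterministic; `temperature` then reshapes the probabilities within that set.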
**Resources:**

* [Full OCI GenAI + LangChain Guide](https://blogs.oracle.com/ai-and-datascience/developing-ai-apps-oci-generative-ai-langchain)
* [LangChain OCI Integration Docs](https://docs.oracle.com/en-us/iaas/Content/generative-ai/langchain.htm)
## Integrating n8n Workflows with OCI Generative AI

### Prerequisites
1. **OCI Account and Permissions:**
   * Create or use an active OCI tenancy with Generative AI enabled.
   * Set up IAM policies: grant your user/group access to `generative-ai-family` in a sandbox compartment (e.g., `allow group <group-name> to manage generative-ai-family in compartment <compartment-name>`).
   * Gather your compartment OCID (OCI Console > Identity & Security > Compartments).
   * Docs: [OCI Generative AI Getting Started](https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm) and [IAM Policies](https://docs.oracle.com/en-us/iaas/Content/Identity/Concepts/policygetstarted.htm).
2. **[Set up OCI API key](https://docs.oracle.com/en-us/iaas/Content/generative-ai/setup-oci-api-auth.htm) authentication locally.**
3. n8n installed and running (self-hosted or cloud; version 1.0+ recommended for AI nodes).
## Step 1: Launch the OCI GenAI Gateway
The gateway runs a local server (port 8088) that mimics the OpenAI API, allowing n8n to talk to OCI-hosted models, for example models from the OpenAI, Llama, and Grok families.

### Option 1: Run with Uvicorn (Local Development)
From the repo root:

1. Navigate to `./app` and install dependencies: `pip install -r requirements.txt`.
2. Start the server:
```bash
cd app
uvicorn app:app --host 0.0.0.0 --port 8088 --reload
```
For production (Linux only, with Gunicorn for scaling):
```bash
gunicorn app:app --workers 16 --worker-class uvicorn.workers.UvicornWorker --timeout 600 --bind 0.0.0.0:8088
```

### Option 2: Run with Podman (Containerized Deployment)

1. Install Podman:
   * Linux: `sudo apt install podman` (Ubuntu) or `sudo dnf install podman` (Fedora).
   * macOS: `brew install podman`.
   * Windows: follow the [Podman for Windows guide](https://podman.io/docs/installation#windows).
2. Ensure `~/.oci/config` is set up (see prerequisites).
3. Build and run:
```bash
podman build -t oci_genai_gateway .
podman run -p 8088:8088 \
  -v ~/.oci:/root/.oci:Z \
  -it --name oci_genai_gateway oci_genai_gateway
```
4. Verify: open `http://localhost:8088` in a browser (it should show a health check or the API docs). Check logs with `podman logs oci_genai_gateway`.

**Gateway Endpoint:** The server exposes `/v1/chat/completions` (OpenAI-compatible) at `http://localhost:8088`.

48+
In n8n, use the **OpenAI** node but point it to your local gateway as a custom endpoint. This lets you select OCI models seamlessly.
49+
50+
1. In n8n, create a new workflow.
51+
2. Add an **OpenAI** node (under AI > Chat Models).
52+
3. Configure credentials:
53+
* **API Key:** Leave blank or use a dummy value (gateway uses OCI auth).
54+
* **Base URL:** `http://host.docker.internal:8088/v1` (for Podman/Docker; use `http://localhost:8088/v1` if running natively).
55+
* **Model:** Select or enter an OCI model (list available models via gateway docs or OCI Console).
56+
n8n will now route requests through the gateway to OCI GenAI.
57+
58+
## Step 3: Simple n8n Workflow Example – AI-Powered Text Summarization
This example creates a workflow that triggers on a webhook (e.g., incoming email or form data), summarizes the text using an OCI LLM, and sends the result via email. It demonstrates calling the LLM with minimal code.

### Workflow Overview
* **Trigger:** Webhook (receives input text).
* **AI Node:** OpenAI (calls OCI via the gateway for summarization).
* **Output:** Email node (sends the summary).

### Step-by-Step Setup in n8n

1. **Add Webhook Trigger:**
   * Drag in a Webhook node.
   * Method: POST.
   * Path: `/summarize` (test URL: `http://your-n8n-instance/webhook/summarize`).
   * This receives a JSON payload like `{"text": "Long article content here..."}`.
2. **Add OpenAI Node for Summarization:**
   * Connect it after the Webhook.
   * Operation: Chat.
   * Credentials: use the custom setup from Step 2 (Base URL: `http://localhost:8088/v1`).
   * Model: `meta.llama-3.3-70b-instruct` (or your preferred OCI model).
   * Messages:
     * Role: System – `You are a helpful summarizer. Provide a concise 3-sentence summary.`
     * Role: User – `{{ $json.text }}` (references the webhook input).
   * Options: Temperature: 0.3 (for consistent outputs); Max Tokens: 200.

   **Effective API Call (Behind the Scenes):**
   The node sends a request like this to the gateway (`POST http://localhost:8088/v1/chat/completions`):
   ```json
   {
     "model": "meta.llama-3.3-70b-instruct",
     "messages": [
       {"role": "system", "content": "You are a helpful summarizer. Provide a concise 3-sentence summary."},
       {"role": "user", "content": "{{input_text}}"}
     ],
     "temperature": 0.3,
     "max_tokens": 200
   }
   ```
   The gateway forwards this to OCI GenAI, returning a response like:
   ```json
   {
     "choices": [
       {
         "message": {
           "role": "assistant",
           "content": "Summary sentence 1. Sentence 2. Sentence 3."
         }
       }
     ]
   }
   ```
3. **Add Email Output Node:**
   * Connect it after the OpenAI node.
   * Node: Send Email (or Gmail/SMTP).
   * To: `recipient@example.com`.
   * Subject: `AI Summary of Input`.
   * Body: `{{ $json.choices[0].message.content }}` (extracts the summary from the AI response).
4. **Activate and Test:**
   * Save and activate the workflow.
   * Send a POST request to the webhook (e.g., via curl):
   ```bash
   curl -X POST http://your-n8n-instance/webhook/summarize \
     -H "Content-Type: application/json" \
     -d '{"text": "Oracle OCI GenAI enables powerful automations. n8n connects apps seamlessly. Together, they boost productivity."}'
   ```
   * Check n8n executions: the workflow should summarize the text and email it.
   * View the logs in n8n for details (e.g., AI response parsing).
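The expression `{{ $json.choices[0].message.content }}` used in the email node is plain JSON traversal over the OpenAI-style response. A Python equivalent, using a sample response shaped like the one shown above:

```python
def extract_summary(response: dict) -> str:
    """Mirror the n8n expression {{ $json.choices[0].message.content }}:
    take the first choice and return its assistant message text."""
    return response["choices"][0]["message"]["content"]

# Sample OpenAI-compatible response, as returned by the gateway
reply = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "Summary sentence 1. Sentence 2. Sentence 3."}}
    ]
}
print(extract_summary(reply))  # Summary sentence 1. Sentence 2. Sentence 3.
```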

## Troubleshooting & Tips
* **Authentication Errors:** Verify `~/.oci/config` permissions and OCI policies (see prerequisites). Test with OCI CLI commands such as `oci os ns get`.
* **Connection Issues:** Ensure port 8088 is open (firewall/OCI security lists). For Podman, use `--network=host` if needed.
* **Model Not Found:** List OCI models in the Console under **Analytics & AI > Generative AI**. Ensure your tenancy has the required permissions for `generative-ai-family`.
