
Commit eb4f916

Merge pull request #7432 from nmelehan-akamai/rc-v1.401.0
[Release] v1.401.0
2 parents 543f5dc + f1a3e8e commit eb4f916

12 files changed

Lines changed: 235 additions & 7 deletions


Lines changed: 89 additions & 0 deletions
@@ -0,0 +1,89 @@
---
title: "Deploy DeepSeek R1"
description: "Learn how to deploy DeepSeek R1, a distilled open-weight reasoning model from DeepSeek, on an Akamai Compute Instance."
published: 2026-02-26
modified: 2026-02-26
keywords: ['artificial intelligence', 'ai', 'LLM', 'machine learning', 'deepseek', 'deepseek-r1', 'open webui', 'vllm', 'reasoning']
tags: ["quick deploy apps", "linode platform", "cloud manager"]
aliases: ['/products/tools/marketplace/guides/deepseek-with-openwebui/']
external_resources:
- '[Open WebUI Documentation](https://docs.openwebui.com/getting-started/)'
- '[DeepSeek R1 Distill Qwen 7B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-7B)'
- '[DeepSeek R1 Distill Qwen 14B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-14B)'
- '[DeepSeek R1 Distill Qwen 32B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B)'
authors: ["Akamai"]
contributors: ["Akamai"]
license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
marketplace_app_id: PLACEHOLDER
marketplace_app_name: "DeepSeek R1 with Open WebUI"
---

Open WebUI is an open-source, self-hosted web interface for interacting with and managing Large Language Models (LLMs). It supports multiple AI backends, multi-user access, and extensible integrations, enabling secure and customizable deployment for local or remote model inference.

The Quick Deploy App deployed in this guide uses DeepSeek R1 Distill Qwen, a family of distilled open-weight reasoning models based on Qwen2.5. These models are derived from the full 671B DeepSeek-R1 and feature chain-of-thought reasoning capabilities. During deployment, you can choose between three model sizes: 7B (default), 14B, or 32B. All models are released under the MIT license.

## Deploying a Quick Deploy App

{{% content "deploy-marketplace-apps-shortguide" %}}

{{% content "marketplace-verify-standard-shortguide" %}}

{{< note title="Estimated deployment time" >}}
Open WebUI with DeepSeek R1 should be fully installed within 5-10 minutes after the Compute Instance has finished provisioning.
{{< /note >}}

## Configuration Options

- **Recommended plan for DeepSeek R1 Distill Qwen 7B (default):** Any 1-GPU instance (16GB RAM minimum)
- **Recommended plan for DeepSeek R1 Distill Qwen 14B:** Any 2-GPU instance or higher (32GB RAM minimum)
- **Recommended plan for DeepSeek R1 Distill Qwen 32B:** Any 4-GPU instance (128GB RAM minimum)
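After the instance provisions, you can confirm that it actually exposes the expected GPUs. This check is not part of the Quick Deploy App; it is a generic sketch using the standard NVIDIA driver tooling:

```command
# List the GPUs visible to the NVIDIA driver. If nvidia-smi is missing,
# the instance is likely not a GPU plan.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi -L
else
    echo "nvidia-smi not found; is this a GPU plan?"
fi
```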

{{< note type="warning" >}}
This Quick Deploy App only works with Akamai GPU instances. If you choose a plan other than a GPU instance, provisioning fails and a notice appears in the LISH console.
{{< /note >}}

### DeepSeek R1 Options

- **Linode API Token** *(required)*: Your API token is used to deploy additional Compute Instances as part of this cluster. At a minimum, this token must have Read/Write access to *Linodes*. If you do not yet have an API token, see [Get an API Access Token](/docs/products/platform/accounts/guides/manage-api-tokens/) to create one.

- **Email address (for the Let's Encrypt SSL certificate)** *(required)*: Your email is used for Let's Encrypt renewal notices. This allows you to visit Open WebUI securely through a browser.

- **Open WebUI admin name** *(required)*: This is the name associated with your login and is required by Open WebUI during initial enrollment.

- **Open WebUI admin email** *(required)*: This is the email address used to log in to Open WebUI.

- **DeepSeek Model Size** *(required)*: Select the model size for deployment. Options are `7B` (default), `14B`, or `32B`. Larger models require more GPUs and memory. The 14B and 32B models automatically use tensor parallelism across multiple GPUs.
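For reference, tensor parallelism in vLLM is controlled by its `--tensor-parallel-size` flag. The sketch below is illustrative only (the app's own deployment scripts may differ) and shows how the GPU count could be derived from the chosen model variant, mirroring the recommended plans above:

```command
# Illustrative only: pick a tensor-parallel size to match the model
# variant (7B: 1 GPU, 14B: 2 GPUs, 32B: 4 GPUs).
model="deepseek-ai/DeepSeek-R1-Distill-Qwen-14B"
case "$model" in
    *7B)  tp=1 ;;
    *14B) tp=2 ;;
    *32B) tp=4 ;;
esac
echo "vllm serve $model --tensor-parallel-size $tp"
```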

{{% content "marketplace-required-limited-user-fields-shortguide" %}}

{{% content "marketplace-special-character-limitations-shortguide" %}}

## Getting Started After Deployment

### Accessing the Open WebUI Frontend

Once your app has finished deploying, you can log in to Open WebUI using your browser.

1. Log into the instance as your limited sudo user, replacing `{{< placeholder "USER" >}}` with the sudo username you created, and `{{< placeholder "IP_ADDRESS" >}}` with the instance's IPv4 address:

    ```command
    ssh {{< placeholder "USER" >}}@{{< placeholder "IP_ADDRESS" >}}
    ```

2. Upon logging into the instance, a banner appears containing the **App URL**. Open your browser and navigate to this URL to reach the Open WebUI login page.

    !["Open WebUI Login Page"](openwebui-login.png "Open WebUI Login Page")

3. Return to your terminal and open the `.credentials` file with the following command, replacing `{{< placeholder "USER" >}}` with your sudo username:

    ```command
    sudo cat /home/{{< placeholder "USER" >}}/.credentials
    ```

4. In the `.credentials` file, locate the Open WebUI login email and password. Go back to the Open WebUI login page and paste the credentials to log in. When you log in successfully, you should see the following page.

    !["Open WebUI Welcome 1"](openwebui-w1.png "Open WebUI Welcome 1")

After you click the **Okay, Let's Go!** button, you can start using the chat feature in Open WebUI.

!["Open WebUI Welcome 2"](openwebui-w2.png "Open WebUI Welcome 2")
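Beyond the chat UI, the model is served by vLLM, which exposes an OpenAI-compatible API. The sketch below assumes the backend listens locally on vLLM's default port 8000 and uses the 7B model name; both are assumptions, so verify them against your actual deployment before relying on this:

```command
# Assumption: the vLLM backend listens on localhost:8000 (vLLM's default).
# Build an OpenAI-compatible chat-completion request payload.
payload='{"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B", "messages": [{"role": "user", "content": "Hello"}]}'
echo "$payload"
# Send it with:
#   curl -s http://localhost:8000/v1/chat/completions \
#     -H 'Content-Type: application/json' -d "$payload"
```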

docs/marketplace-docs/guides/gemma3/index.md

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ Before deployment, you need a Hugging Face API token to access the Gemma 3 model
 1. Create a free account at [huggingface.co/join](https://huggingface.co/join).
 1. Accept the Gemma license at [huggingface.co/google/gemma-3-12b-it](https://huggingface.co/google/gemma-3-12b-it).
 1. Generate a token at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read-only access is sufficient.
-1. Provide this token during the Marketplace deployment process.
+1. Provide this token during the deployment process.

 {{% content "marketplace-required-limited-user-fields-shortguide" %}}

docs/marketplace-docs/guides/gpt-oss-with-openwebui/index.md

Lines changed: 4 additions & 4 deletions
@@ -1,5 +1,5 @@
 ---
-title: "Deploy GPT-OSS with Open WebUI through the Linode Marketplace"
+title: "Deploy GPT-OSS with Open WebUI"
 description: "This guide includes instructions on how to deploy Open WebUI with GPT-OSS self-hosted LLM on an Akamai Compute Instance."
 published: 2026-02-12
 modified: 2026-02-12
@@ -19,9 +19,9 @@ marketplace_app_name: "GPT-OSS with Open WebUI"

 Open WebUI is an open-source, self-hosted web interface for interacting with and managing Large Language Models (LLMs). It supports multiple AI backends, multi-user access, and extensible integrations, enabling secure and customizable deployment for local or remote model inference.

-The Marketplace application deployed in this guide uses OpenAI GPT-OSS, a family of open-weight large language models designed for powerful reasoning, agentic tasks, and versatile developer use cases. During deployment, you can choose between two model sizes: GPT-OSS 20B (default) or GPT-OSS 120B. These models are released under the permissive Apache 2.0 license and integrate well with self-hosted platforms like Open WebUI for general-purpose assistance, coding, and knowledge-based workflows.
+The Quick Deploy App deployed in this guide uses OpenAI GPT-OSS, a family of open-weight large language models designed for powerful reasoning, agentic tasks, and versatile developer use cases. During deployment, you can choose between two model sizes: GPT-OSS 20B (default) or GPT-OSS 120B. These models are released under the permissive Apache 2.0 license and integrate well with self-hosted platforms like Open WebUI for general-purpose assistance, coding, and knowledge-based workflows.

-## Deploying a Marketplace App
+## Deploying a Quick Deploy App

 {{% content "deploy-marketplace-apps-shortguide" %}}

@@ -37,7 +37,7 @@ Open WebUI with GPT-OSS should be fully installed within 5-10 minutes after the
 - **Recommended plan for GPT-OSS 120B:** RTX4000 Ada x1 Large or higher (64GB RAM minimum)

 {{< note type="warning" >}}
-This Marketplace App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
+This Quick Deploy App only works with Akamai GPU instances. If you choose a plan other than GPUs, the provisioning will fail, and a notice will appear in the LISH console.
 {{< /note >}}

 ### GPT-OSS Options
Lines changed: 139 additions & 0 deletions
@@ -0,0 +1,139 @@
---
title: "Deploy OpenClaw"
description: "This tutorial shows you how to deploy OpenClaw as a Quick Deploy App."
published: 2026-03-17
modified: 2026-03-17
keywords: ['AI', 'AI Agent']
tags: ["quick deploy apps", "AI", "AI Agent"]
aliases: ['/products/tools/marketplace/guides/openclaw/','/guides/openclaw/']
external_resources:
- '[OpenClaw](https://openclaw.ai/)'
- '[OpenClaw Documentation](https://docs.openclaw.ai/)'
authors: ["Akamai"]
contributors: ["Akamai"]
license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
marketplace_app_id: 2049320
marketplace_app_name: "OpenClaw"
---

[OpenClaw](https://openclaw.ai/) is an open-source AI agent platform that runs locally and executes tasks through a persistent Gateway service. The Gateway connects communication channels, tools, and AI models, allowing the agent to receive messages, perform actions, and automate workflows. Administrators configure and manage the system through a CLI onboarding wizard and a local web dashboard. Our Quick Deploy App allows you to connect to the OpenClaw dashboard via a secure HTTPS endpoint protected by HTPASSWD.

This Quick Deploy App creates a limited system user called `openclaw`.

## Deploying a Quick Deploy App

{{% content "deploy-marketplace-apps-shortguide" %}}

{{% content "marketplace-verify-standard-shortguide" %}}

{{< note >}}
**Estimated deployment time:** OpenClaw should be fully installed within 5-10 minutes after the Compute Instance has finished provisioning.
{{< /note >}}

## Configuration Options

- **Supported distributions:** Ubuntu 24.04 LTS
- **Recommended plan:** All plan types and sizes can be used.

### OpenClaw Options

- **Email address** *(required)*: Enter the email address you want to use for generating the SSL certificates via Let's Encrypt.

{{% content "marketplace-required-limited-user-fields-shortguide" %}}

{{% content "marketplace-custom-domain-fields-shortguide" %}}

{{% content "marketplace-special-character-limitations-shortguide" %}}

## Getting Started after Deployment

### Performing OpenClaw Onboarding

Once the deployment is complete, `openclaw` is installed on the instance but not yet running. Before you can start using OpenClaw, you need to go through the onboarding wizard. This Quick Deploy App triggers the onboarding for you when you log in as root.

1. Log into the instance.

    If you disabled root login to the server during the setup of the OpenClaw app, log into the server as the sudo user:

    ```command
    ssh admin@YOUR_INSTANCE_IP
    ```

    Replace `YOUR_INSTANCE_IP` with the IP address of your Linode instance and `admin` with the sudo user you created.

1. Escalate privileges to root.

    Once you've logged in, note the [motd](https://man7.org/linux/man-pages/man5/motd.5.html):

    ```output
    *********************************************************
    Akamai Connected Cloud OpenClaw Quick Deploy App
    Dashboard URL: https://172-235-150-14.ip.linodeusercontent.com
    Credentials File: /home/admin/.credentials
    Documentation: https://www.linode.com/docs/marketplace-docs/guides/openclaw/
    *********************************************************
    ```

    Copy the sudo password from the `/home/admin/.credentials` file shown in the banner and enter the following command from the terminal:

    ```command
    sudo su -
    ```

    When prompted for the password, paste the sudo password you got from the credentials file. When you log in as **root**, note the following message about the onboarding wizard:

    ![OpenClaw Init](openclaw-init.jpg)

    If you are ready to perform the onboarding, enter `y` to launch OpenClaw's onboarding wizard and complete the setup.

    ![OpenClaw Onboard](openclaw-onboard.jpg)

    Once onboarding is complete, the onboarding script is removed.

### Confirm Gateway Status

OpenClaw is now configured on the server. To verify the gateway is running, become the `openclaw` user. Enter the following from the terminal as the **root** user:

```command
su - openclaw
```

To view the gateway status, enter the following as the **openclaw** user:

```command
openclaw gateway status
```

This should yield output similar to the following:

![OpenClaw GW Status](openclaw-gws.jpg)

### Dashboard Access

Once the onboarding is complete and the gateway is running, you can access the dashboard from the domain you configured in the initial deployment of the app. If you did not enter a domain name during deployment, the dashboard is accessible using the instance's rDNS value. You can view the rDNS value from the [Linode's Network](https://techdocs.akamai.com/cloud-computing/docs/configure-rdns-reverse-dns-on-a-compute-instance#setting-reverse-dns) tab. This example uses the domain `172-233-177-79.ip.linodeusercontent.com`.

To authenticate to the dashboard, you need to provide two credentials:

1. **Dashboard token**: If you did not record a dashboard token during the onboarding steps, retrieve one as follows:

    1. Become the `openclaw` user:

        ```command
        su - openclaw
        ```

    1. Run the following:

        ```command
        openclaw dashboard --no-open
        ```

    1. Get the entire token value (for example, `#token=a0764fb`) from the `Dashboard URL:` link.

1. **Nginx basic auth**: Get the `Htpassword` password and `Htpasswd username` user from `/home/admin/.credentials`.

Now you have everything you need to access the dashboard. For example:

`https://172-233-177-79.ip.linodeusercontent.com/#token=a0764fb`
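If you only saved the full dashboard URL, the token portion can be recovered with standard shell parameter expansion. The values below are this guide's examples; yours will differ:

```command
# Strip everything up to and including '#token=' to isolate the token.
url='https://172-233-177-79.ip.linodeusercontent.com/#token=a0764fb'
token="${url##*#token=}"
echo "$token"
```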

When you access the web page, you are prompted for the HTPASSWD details.

![Nginx Basic Auth](openclaw-htpasswd.jpg)

Enter the Username as **openclaw** and the Password from the `/home/admin/.credentials` file.

{{% content "marketplace-update-note-shortguide" %}}
