Commit e96db71

[Add] DeepSeek App Quick Deploy / Marketplace Documentation (#7406)
* initial commit
* update UI image
* dictionary addition

Co-authored-by: jddocs <jdutton@akamai.com>

1 parent: ccad70c

4 files changed: 89 additions, 0 deletions
---
title: "Deploy DeepSeek R1"
description: "Learn how to deploy DeepSeek R1, a distilled open-weight reasoning model from DeepSeek, on an Akamai Compute Instance."
published: 2026-02-26
modified: 2026-02-26
keywords: ['artificial intelligence', 'ai', 'LLM', 'machine learning', 'deepseek', 'deepseek-r1', 'open webui', 'vllm', 'reasoning']
tags: ["quick deploy apps", "linode platform", "cloud manager"]
aliases: ['/products/tools/marketplace/guides/deepseek-with-openwebui/']
external_resources:
- '[Open WebUI Documentation](https://docs.openwebui.com/getting-started/)'
- '[DeepSeek R1 Distill Qwen 7B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-7B)'
- '[DeepSeek R1 Distill Qwen 14B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-14B)'
- '[DeepSeek R1 Distill Qwen 32B on Hugging Face](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B)'
authors: ["Akamai"]
contributors: ["Akamai"]
license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
marketplace_app_id: PLACEHOLDER
marketplace_app_name: "DeepSeek R1 with Open WebUI"
---
Open WebUI is an open-source, self-hosted web interface for interacting with and managing Large Language Models (LLMs). It supports multiple AI backends, multi-user access, and extensible integrations, enabling secure and customizable deployment for local or remote model inference.

The Quick Deploy App in this guide uses DeepSeek R1 Distill Qwen, a family of distilled open-weight reasoning models based on Qwen2.5. These models are distilled from the full 671B-parameter DeepSeek-R1 and retain its chain-of-thought reasoning capabilities. During deployment, you can choose between three model sizes: 7B (default), 14B, or 32B. All three models are released under the MIT license.

## Deploying a Marketplace App

{{% content "deploy-marketplace-apps-shortguide" %}}

{{% content "marketplace-verify-standard-shortguide" %}}

{{< note title="Estimated deployment time" >}}
Open WebUI with DeepSeek R1 should be fully installed within 5-10 minutes after the Compute Instance has finished provisioning.
{{< /note >}}
## Configuration Options

- **Recommended plan for DeepSeek R1 Distill Qwen 7B (default):** Any 1-GPU instance (16GB RAM minimum)
- **Recommended plan for DeepSeek R1 Distill Qwen 14B:** Any 2-GPU instance or higher (32GB RAM minimum)
- **Recommended plan for DeepSeek R1 Distill Qwen 32B:** Any 4-GPU instance (128GB RAM minimum)

{{< note type="warning" >}}
This Quick Deploy App only works with Akamai GPU instances. If you choose a non-GPU plan, provisioning fails and a notice appears in the LISH console.
{{< /note >}}
### DeepSeek R1 Options

- **Linode API Token** *(required)*: Your API token is used to provision resources as part of this deployment. At a minimum, this token must have Read/Write access to *Linodes*. If you do not yet have an API token, see [Get an API Access Token](/docs/products/platform/accounts/guides/manage-api-tokens/) to create one.

- **Email address (for the Let's Encrypt SSL certificate)** *(required)*: Your email address is used for Let's Encrypt renewal notices. The resulting certificate lets you access Open WebUI securely over HTTPS in a browser.

- **Open WebUI admin name** *(required)*: The name associated with your login, required by Open WebUI during initial enrollment.

- **Open WebUI admin email** *(required)*: The email address used to log in to Open WebUI.

- **DeepSeek Model Size** *(required)*: Select the model size for deployment. Options are `7B` (default), `14B`, or `32B`. Larger models require more GPUs and memory. The 14B and 32B models automatically use tensor parallelism across multiple GPUs.
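Tensor parallelism splits each model layer across the available GPUs so that larger models fit in aggregate GPU memory. As a rough sketch of what this means under the hood (the exact service configuration used by this app may differ), starting a vLLM server by hand for the 14B model on a 2-GPU instance would pass the GPU count through the `--tensor-parallel-size` flag:

```command
vllm serve deepseek-ai/DeepSeek-R1-Distill-Qwen-14B --tensor-parallel-size 2
```

The flag value should match the number of GPUs on your plan; the 7B default runs on a single GPU and needs no parallelism.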
{{% content "marketplace-required-limited-user-fields-shortguide" %}}

{{% content "marketplace-special-character-limitations-shortguide" %}}
## Getting Started After Deployment

### Accessing the Open WebUI Frontend

Once your app has finished deploying, you can log in to Open WebUI using your browser.

1.  Log in to the instance as your limited sudo user, replacing `{{< placeholder "USER" >}}` with the sudo username you created and `{{< placeholder "IP_ADDRESS" >}}` with the instance's IPv4 address:

    ```command
    ssh {{< placeholder "USER" >}}@{{< placeholder "IP_ADDRESS" >}}
    ```
2.  Upon logging in, a banner appears containing the **App URL**. Open your browser and navigate to this URL to reach the Open WebUI login page.

    !["Open WebUI Login Page"](openwebui-login.png "Open WebUI Login Page")

3.  Return to your terminal and open the `.credentials` file with the following command, replacing `{{< placeholder "USER" >}}` with your sudo username:

    ```command
    sudo cat /home/{{< placeholder "USER" >}}/.credentials
    ```
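    If you only need a single field, you can filter the file instead of printing all of it. This sketch assumes the file stores one `label: value` pair per line; the exact labels may differ on your deployment:

    ```command
    sudo grep -i 'password' /home/{{< placeholder "USER" >}}/.credentials
    ```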
4.  In the `.credentials` file, locate the Open WebUI login email and password. Go back to the Open WebUI login page and enter the credentials to log in. When you successfully log in, you should see the following page:

    !["Open WebUI Welcome 1"](openwebui-w1.png "Open WebUI Welcome 1")

Once you click the **Okay, Let's Go!** button, you can start using the chat feature in Open WebUI.

!["Open WebUI Welcome 2"](openwebui-w2.png "Open WebUI Welcome 2")
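Open WebUI sits in front of a vLLM inference server on the instance. If you prefer to query the model programmatically rather than through the web interface, vLLM exposes an OpenAI-compatible API. The request below is a sketch run from the instance itself; it assumes vLLM is listening on its default port `8000` on localhost and that the 7B model was selected, both of which may differ on your deployment:

```command
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
        "messages": [{"role": "user", "content": "Hello!"}]
    }'
```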