The Kernel Hub allows Python libraries and applications to load compute
-kernels directly from the [Hub](https://hf.co/). To support this kind
+kernels directly from the [Hub](https://huggingface.co/). To support this kind
of dynamic loading, Hub kernels differ from traditional Python kernel
packages in that they are made to be:
@@ -57,8 +63,7 @@ activation.gelu_fast(y, x)
print(y)
```
-You can [search for kernels](https://huggingface.co/models?other=kernels) on
-the Hub.
+Browse available kernels at [huggingface.co/kernels](https://huggingface.co/kernels).
## 📚 Documentation
diff --git a/docs/source/builder/build.md b/docs/source/builder/build.md
index 38b0fd21..4069c513 100644
--- a/docs/source/builder/build.md
+++ b/docs/source/builder/build.md
@@ -179,6 +179,13 @@ $ cd mykernel
$ kernel-builder build-and-upload
```
+> [!NOTE]
+> Uploads go to a `kernel`-type Hub repository (the first-class kernel
+> repository type). The owning user or org must have kernel-creation
+> access. Request it from
+> [huggingface.co/settings/account](https://huggingface.co/settings/account)
+> ("Request Kernels Creation").
+
Aside from building and uploading the kernel itself, this will also fill
the card template and upload it as `README.md` to the Hub if the card
template is provided in the source repository as `CARD.md`.
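Conceptually, the card step is plain template filling: placeholders in `CARD.md` are replaced with kernel metadata and the result is written out as `README.md`. A minimal sketch, assuming simple `{placeholder}` substitution (the placeholder names here are hypothetical illustrations, not the actual template keys used by `kernel-builder`):

```python
# Hypothetical sketch of the card-filling step: substitute metadata
# into a CARD.md-style template and produce README.md content.
# Placeholder names are illustrative only; see kernel-builder for
# the real template format.
card_template = "# {kernel_name}\n\nSupported torch versions: {torch_versions}\n"

metadata = {
    "kernel_name": "activation",
    "torch_versions": "2.6, 2.7",
}

readme = card_template.format(**metadata)
print(readme)
```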
diff --git a/docs/source/index.md b/docs/source/index.md
index f3c4d2d2..230df308 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -1,13 +1,17 @@
# Kernels
The Kernel Hub allows Python libraries and applications to load compute
-kernels directly from the [Hub](https://hf.co/). To support this kind
-of dynamic loading, Hub kernels differ from traditional Python kernel
-packages in that they are made to be:
+kernels directly from the [Hub](https://huggingface.co/). Kernels are a first-class
+repository type on the Hub, with dedicated pages that surface supported
+hardware and versions. To support dynamic loading, Hub kernels differ from
+traditional Python kernel packages in that they are made to be:
- **Portable**: a kernel can be loaded from paths outside `PYTHONPATH`.
- **Unique**: multiple versions of the same kernel can be loaded in the
@@ -16,8 +20,7 @@ packages in that they are made to be:
the different PyTorch build configurations (various CUDA versions
and C++ ABIs). Furthermore, older C library versions must be supported.
-You can [search for kernels](https://huggingface.co/models?other=kernels) on
-the Hub.
+Browse available kernels at [huggingface.co/kernels](https://huggingface.co/kernels).
If you're looking for a more involved "Why kernels?" answer, refer to
[this page](./why_kernels.md).
\ No newline at end of file
diff --git a/docs/source/integrating-kernels.md b/docs/source/integrating-kernels.md
index 5b0a27ab..5d222b82 100644
--- a/docs/source/integrating-kernels.md
+++ b/docs/source/integrating-kernels.md
@@ -36,5 +36,5 @@ Besides leveraging pre-built compute kernels, different projects
rely on `kernels` to also package, build, and distribute their
kernels on the Hugging Face Hub platform. This is made possible by the
["builder" component of `kernels`](./builder/writing-kernels.md).
-Visit [this page](https://huggingface.co/models?other=kernels) to find out
-different pre-built compute kernels available on the Hub.
\ No newline at end of file
+Visit [huggingface.co/kernels](https://huggingface.co/kernels) to browse
+the pre-built compute kernels available on the Hub.
\ No newline at end of file
diff --git a/docs/source/kernel-requirements.md b/docs/source/kernel-requirements.md
index 5b1c401f..64f9c621 100644
--- a/docs/source/kernel-requirements.md
+++ b/docs/source/kernel-requirements.md
@@ -7,6 +7,30 @@ systems and Torch builds.
[Join us on Discord](https://discord.gg/H6Tkmd88N3) for questions and discussions
about building kernels!
+## Repository type
+
+Compliant kernels are published as `kernel`-type repositories on the Hub
+(the first-class kernel repository type). New uploads via `kernel-builder`
+default to this type; see the [migration guide](migration.md) if you
+maintain an older `model`-type kernel repository.
+
+## Trusted publishers
+
+`kernels` only loads kernels from a curated set of trusted publishers by
+default. Loading from any other publisher raises an error unless the caller
+opts in with `trust_remote_code=True`:
+
+```python
+from kernels import get_kernel
+
+# Trusted publisher: works without opt-in.
+get_kernel("kernels-community/activation", version=1)
+
+# Untrusted publisher: must opt in explicitly.
+get_kernel("some-other-org/my-kernel", version=1, trust_remote_code=True)
+```
+
+The Hub also exposes a `trustedKernelPublisher` flag on the kernel API and
+displays a corresponding badge in the UI.
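Client code could use that flag to separate trusted from untrusted publishers before deciding whether to pass `trust_remote_code=True`. A sketch against a hand-written sample payload (the field layout below is an assumption for illustration; only the flag name comes from the text above):

```python
# Sketch: partition kernels by the trustedKernelPublisher flag.
# The payload is hand-written sample data, not a real API response;
# only the "trustedKernelPublisher" field name is taken from the docs.
sample_kernels = [
    {"id": "kernels-community/activation", "trustedKernelPublisher": True},
    {"id": "some-other-org/my-kernel", "trustedKernelPublisher": False},
]

# Kernels loadable without trust_remote_code=True.
trusted = [k["id"] for k in sample_kernels if k["trustedKernelPublisher"]]
print(trusted)
```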
+
## Directory layout
A kernel repository on the Hub must contain a `build` directory. This
diff --git a/docs/source/migration.md b/docs/source/migration.md
index 72760661..67612539 100644
--- a/docs/source/migration.md
+++ b/docs/source/migration.md
@@ -66,3 +66,28 @@ kernel_layer_mapping = {
}
}
```
+
+## 0.14
+
+### `kernel` repo type on the Hub
+
+Kernels are now a first-class repository type on the Hugging Face Hub, and
+`kernels` 0.14 loads kernels exclusively from `kernel`-type repositories.
+`model`-type kernel repositories are no longer supported by the loader.
+
+New uploads via `kernel-builder build-and-upload` default to
+`--repo-type kernel`. To publish, the owning user or org must have
+kernel-creation access. Request it from
+[huggingface.co/settings/account](https://huggingface.co/settings/account)
+("Request Kernels Creation").
+
+To migrate an existing `model`-type kernel repository:
+
+1. Make sure the publishing org has been granted kernel-creation access
+ (see above).
+2. Re-upload with `kernel-builder build-and-upload` to a `kernel`-type
+   repository: keep the same `repo-id` in `build.toml` if the existing
+   repository has been converted to the new type, or point it at a newly
+   created `kernel`-type repository.
+3. Update consumers' `get_kernel(...)` and `LayerRepository(...)` calls
+ to reference the new repository if the `repo-id` changed.