Commit 509a2d8

Update sampling parameter sections for experimentation (#196)
Explain the nature of sampling parameter support for extensions and experimental web contexts. Refine some surrounding text for recently renamed identifiers.
1 parent 7ff8e94 commit 509a2d8

2 files changed (+28 −27 lines)

README.md

Lines changed: 22 additions & 21 deletions
@@ -45,24 +45,23 @@ The following are potential goals we are not yet certain of:
 
 Both of these potential goals could pose challenges to interoperability, so we want to investigate more how important such functionality is to developers to find the right tradeoff.
 
-### API Updates: Deprecations and Renaming
+## Experiments and Updates
 
-To improve API clarity, consistency, and address inconsistencies in parameter support across various models, the LanguageModel API has been updated.
+### Sampling Parameters
 
-**Deprecated Features:**
+Developers have expressed the value of tuning language model sampling parameters for testing and optimizing task-specific model behavior. At the same time, web standards engagements have highlighted the need for more interoperable API shapes for sampling parameters among different models.
 
-The following features of the LanguageModel API are **deprecated** and their functionality is now restricted to web extension contexts only:
+The API was initially made available in extension contexts with the following sampling parameter options and attributes:
 
-* The static method `LanguageModel.params()`
-* The instance attributes `languageModel.topK` and `languageModel.temperature`
-* The `LanguageModelParams` interface and all its attributes (`defaultTopK`, `maxTopK`, `defaultTemperature`, `maxTemperature`)
-* The `topK` and `temperature` options within `LanguageModel.create()`
+* The static method `LanguageModel.params()`, which exposes default and maximum values for sampling parameters: `defaultTemperature`, `maxTemperature`, `defaultTopK`, `maxTopK`.
+* The `temperature` and `topK` options, which may be provided to `LanguageModel.create()` to control the sampling behavior of individual language model sessions.
+* The `temperature` and `topK` attributes on `LanguageModel` session instances, which expose the current values of the sampling parameters for that session.
 
-These features may be completely removed in the future. This change is intended to simplify the API and address inconsistencies in parameter support across various models.
+Access to these features is limited to extension and experimental web contexts. Ongoing experimentation and community engagement will explore different API shapes that satisfy developer requirements and address interoperability concerns.
 
-**Renamed Features:**
+### Renamed Features
 
-The following features have been renamed. The old names are now deprecated and will only function as aliases within Chrome Extension contexts:
+The following features have been recently renamed. The legacy aliases are deprecated, and clients should update their code to use the new names. The legacy aliases will be removed from extension contexts in a future release.
 
 | Old Name (Deprecated in Extensions, Removed in Web) | New Name (Available in All Contexts) |
 | :-------------------------------------------------- | :---------------------------------------- |
@@ -71,7 +70,6 @@ The following features have been renamed. The old names are now deprecated and w
 | `languageModel.measureInputUsage()` | `languageModel.measureContextUsage()` |
 | `languagemodel.onquotaoverflow` | `languagemodel.oncontextoverflow` |
 
-**Note:** Developers using any of the deprecated features within an extension context will receive warnings in the DevTools Issues tab. These deprecated features and aliases may be completely removed in a future Chrome release.
 
 ## Examples
 
@@ -454,23 +452,26 @@ Note that `append()` can also cause [overflow](#tokenization-context-window-leng
 
 ### Configuration of per-session parameters
 
-In addition to the `initialPrompts` option shown above, in extension contexts, the currently-configurable model parameters are [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [top-K](https://huggingface.co/blog/how-to-generate#top-k-sampling). The `params()` API gives the default and maximum values for these parameters.
+Tuning language model sampling parameters can be useful for both testing and adjusting task-specific model behavior. Common sampling parameters include [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [topK](https://huggingface.co/blog/how-to-generate#top-k-sampling).
 
-**Deprecation Notice:** The `topK` and `temperature` options for `LanguageModel.create()`, the `LanguageModel.params()` static method, and the `languageModel.topK` and `languageModel.temperature` instance attributes are now **deprecated**. These features are only functional within web extension contexts and will be ignored or unavailable in standard web page contexts. They may be completely removed in a future release.
-
-The `LanguageModel.params()` API, only available in extensions, can be used to query the default and maximum values for these parameters.
+**Notice:** Sampling parameter features are currently only available within extension and experimental contexts. While they are useful for exploring model behavior, the current fields are not guaranteed to be supported or interpreted consistently across all models or user agents.
 
 _The limited applicability and non-universal nature of these sampling hyperparameters are discussed further in [issue #42](https://github.com/webmachinelearning/prompt-api/issues/42): sampling hyperparameters are not universal among models._
 
+In extension and experimental contexts:
+* The `LanguageModel.params()` static method provides default and maximum values for temperature and topK parameters, once the user agent has ascertained or downloaded the specific underlying model.
+* The `temperature` and `topK` instance attributes provide the current values for these parameters for a given session.
+* Sampling parameters can also be configured at session creation time via the `temperature` and `topK` options for `LanguageModel.create()`.
+
+
+
 ```js
-// The topK and temperature members of the options object are deprecated. They will only be considered when
-// LanguageModel.create() is called from within a Chrome Extension. In web page contexts, they are ignored.
+// Sampling parameter support is limited to extension and experimental web contexts.
+// Accessors are undefined, and options are ignored, outside of those contexts.
 const customSession = await LanguageModel.create({
   temperature: 0.8,
   topK: 10
 });
-// This interface and all its attributes (`defaultTopK`, `maxTopK`, `defaultTemperature`, `maxTemperature`)
-// are now only available within Chrome Extension contexts. Web pages can no longer call this method.
 const params = await LanguageModel.params();
 const conditionalSession = await LanguageModel.create({
   temperature: isCreativeTask ? params.defaultTemperature * 1.1 : params.defaultTemperature * 0.8,
@@ -831,7 +832,7 @@ Note that although the API is not exposed to web platform workers, a browser cou
 To actually get a response back from the model given a prompt, the following possible stages are involved:
 
 1. Download the model, if necessary.
-2. Establish a session, including configuring per-session options and parameters.
+2. Establish a session, including configuring per-session options.
 3. Add an initial prompt to establish context. (This will not generate a response.)
 4. Execute a prompt and receive a response.
 
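The restricted availability described in this diff suggests defensive coding on the caller's side. The following sketch is illustrative only (the `createTunedSession` helper is not part of the API): it feature-detects `LanguageModel` and `LanguageModel.params()` before touching sampling parameters, so it degrades gracefully in contexts where they are absent.

```javascript
// Hypothetical helper (not part of the API): create a session with tuned
// sampling parameters only when the current context supports them.
async function createTunedSession(options = {}) {
  // The Prompt API itself may be entirely absent (e.g. unsupported browsers).
  if (typeof LanguageModel === "undefined") {
    return null;
  }
  const createOptions = { ...options };
  // params() is only exposed in extension and experimental web contexts.
  if (typeof LanguageModel.params === "function") {
    const params = await LanguageModel.params();
    if (params) {
      // Fall back to the model's defaults when the caller gave no values.
      createOptions.temperature ??= params.defaultTemperature;
      createOptions.topK ??= params.defaultTopK;
    }
  }
  // Outside extension/experimental contexts, temperature and topK are ignored.
  return LanguageModel.create(createOptions);
}
```

Because unsupported options are ignored rather than rejected, a helper like this lets the same code run in web pages, extensions, and experimental contexts without branching at every call site.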

index.bs

Lines changed: 6 additions & 6 deletions
@@ -37,7 +37,7 @@ These APIs are part of a family of APIs expected to be powered by machine learni
 interface LanguageModel : EventTarget {
   static Promise<LanguageModel> create(optional LanguageModelCreateOptions options = {});
   static Promise<Availability> availability(optional LanguageModelCreateCoreOptions options = {});
-  // **DEPRECATED**: This method is only available in extension contexts.
+  // **EXPERIMENTAL**: Only available in extension and experimental contexts.
   static Promise<LanguageModelParams?> params();
 
   // These will throw "NotSupportedError" DOMExceptions if role = "system"
@@ -75,16 +75,16 @@ interface LanguageModel : EventTarget {
   // **DEPRECATED**: This attribute is only available in extension contexts.
   attribute EventHandler onquotaoverflow;
 
-  // **DEPRECATED**: This attribute is only available in extension contexts.
+  // **EXPERIMENTAL**: Only available in extension and experimental contexts.
   readonly attribute unsigned long topK;
-  // **DEPRECATED**: This attribute is only available in extension contexts.
+  // **EXPERIMENTAL**: Only available in extension and experimental contexts.
   readonly attribute float temperature;
 
   Promise<LanguageModel> clone(optional LanguageModelCloneOptions options = {});
   undefined destroy();
 };
 
-// **DEPRECATED**: This interface and its attributes are only available in extension contexts.
+// **EXPERIMENTAL**: Only available in extension and experimental contexts.
 [Exposed=Window, SecureContext]
 interface LanguageModelParams {
   readonly attribute unsigned long defaultTopK;
@@ -109,9 +109,9 @@ dictionary LanguageModelTool {
 dictionary LanguageModelCreateCoreOptions {
   // Note: these two have custom out-of-range handling behavior, not in the IDL layer.
   // They are unrestricted double so as to allow +Infinity without failing.
-  // **DEPRECATED**: This option is only allowed in extension contexts.
+  // **EXPERIMENTAL**: Only available in extension and experimental contexts.
   unrestricted double topK;
-  // **DEPRECATED**: This option is only allowed in extension contexts.
+  // **EXPERIMENTAL**: Only available in extension and experimental contexts.
   unrestricted double temperature;
 
   sequence<LanguageModelExpected> expectedInputs;
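The IDL comments above note that `topK` and `temperature` have custom out-of-range handling and are declared `unrestricted double` so that +Infinity can be passed without failing at the IDL layer. As an illustration only (the helper name and exact clamping policy are assumptions, not taken from the spec), a caller could normalize requested values against a `LanguageModelParams` object before calling `create()`:

```javascript
// Illustrative sketch: clamp requested sampling parameters into the ranges
// reported by LanguageModelParams. The bounds chosen here (0 for temperature,
// 1 for topK) are assumptions for the example, not normative values.
function clampSamplingOptions(requested, params) {
  const clamp = (value, min, max, fallback) =>
    value === undefined ? fallback : Math.min(Math.max(value, min), max);
  return {
    // unrestricted double admits +Infinity; clamping maps it to the maximum.
    temperature: clamp(requested.temperature, 0, params.maxTemperature, params.defaultTemperature),
    topK: clamp(requested.topK, 1, params.maxTopK, params.defaultTopK),
  };
}
```

With this sketch, `{ temperature: Infinity }` resolves to `maxTemperature`, and omitted options fall back to the defaults reported by `params()`.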
