Update sampling parameter sections for experimentation (#196)
Explain the nature of sampling parameter support for extensions and experimental web contexts.
Refine some surrounding text for recently renamed identifiers.
README.md (+22 −21 lines changed)
@@ -45,24 +45,23 @@ The following are potential goals we are not yet certain of:

Both of these potential goals could pose challenges to interoperability, so we want to investigate more how important such functionality is to developers to find the right tradeoff.
-### API Updates: Deprecations and Renaming
+## Experiments and Updates
-To improve API clarity, consistency, and address inconsistencies in parameter support across various models, the LanguageModel API has been updated.
+### Sampling Parameters
-**Deprecated Features:**
+Developers have expressed the value of tuning language model sampling parameters for testing and optimizing task-specific model behavior. At the same time, web standards engagements have highlighted the need for more interoperable API shapes for sampling parameters among different models.
-The following features of the LanguageModel API are **deprecated** and their functionality is now restricted to web extension contexts only:
+The API was initially made available in extension contexts with the following sampling parameter options and attributes:
-* The static method `LanguageModel.params()`
-* The instance attributes `languageModel.topK` and `languageModel.temperature`
-* The `LanguageModelParams` interface and all its attributes (`defaultTopK`, `maxTopK`, `defaultTemperature`, `maxTemperature`)
-* The `topK` and `temperature` options within `LanguageModel.create()`
+* The static method `LanguageModel.params()`, which exposes default and maximum values for sampling parameters: `defaultTemperature`, `maxTemperature`, `defaultTopK`, `maxTopK`.
+* The `temperature` and `topK` options, which may be provided to `LanguageModel.create()` to control the sampling behavior of individual language model sessions.
+* The `temperature` and `topK` attributes on `LanguageModel` session instances, which expose the current values of the sampling parameters for that session.
-These features may be completely removed in the future. This change is intended to simplify the API and address inconsistencies in parameter support across various models.
+Access to these features is limited to extension and experimental web contexts. Ongoing experimentation and community engagement will explore different API shapes that satisfy developer requirements and address interoperability concerns.
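A rough, non-normative sketch of how a page or extension might feature-detect this context-limited support before relying on it (`FakeLanguageModel` is a made-up stand-in so the snippet can run outside a browser; the real global is `LanguageModel`):

```javascript
// Sketch: feature-detect sampling parameter support before relying on it.
// In contexts without support, LanguageModel.params() and the sampling
// attributes may simply be absent.
function samplingParamsSupported(languageModelCtor) {
  return typeof languageModelCtor?.params === "function";
}

// Made-up stand-in for the real global, so this sketch runs anywhere.
// The numeric values are illustrative, not real defaults.
const FakeLanguageModel = {
  params: async () => ({
    defaultTemperature: 1.0,
    maxTemperature: 2.0,
    defaultTopK: 3,
    maxTopK: 8
  })
};

console.log(samplingParamsSupported(FakeLanguageModel)); // true
console.log(samplingParamsSupported({}));                // false
```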
-**Renamed Features:**
+### Renamed Features
-The following features have been renamed. The old names are now deprecated and will only function as aliases within Chrome Extension contexts:
+The following features have been recently renamed. The legacy aliases are deprecated, and clients should update their code to use the new names. The legacy aliases will be removed from extension contexts in a future release.
| Old Name (Deprecated in Extensions, Removed in Web) | New Name (Available in All Contexts) |
**Note:** Developers using any of the deprecated features within an extension context will receive warnings in the DevTools Issues tab. These deprecated features and aliases may be completely removed in a future Chrome release.
## Examples
@@ -454,23 +452,26 @@ Note that `append()` can also cause [overflow](#tokenization-context-window-leng
### Configuration of per-session parameters
-In addition to the `initialPrompts` option shown above, in extension contexts, the currently-configurable model parameters are [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [top-K](https://huggingface.co/blog/how-to-generate#top-k-sampling). The `params()` API gives the default and maximum values for these parameters.
+Tuning language model sampling parameters can be useful for both testing and adjusting task-specific model behavior. Common sampling parameters include [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [topK](https://huggingface.co/blog/how-to-generate#top-k-sampling).
-**Deprecation Notice:** The `topK` and `temperature` options for `LanguageModel.create()`, the `LanguageModel.params()` static method, and the `languageModel.topK` and `languageModel.temperature` instance attributes are now **deprecated**. These features are only functional within web extension contexts and will be ignored or unavailable in standard web page contexts. They may be completely removed in a future release.
-
-The `LanguageModel.params()` API, only available in extensions, can be used to query the default and maximum values for these parameters.
+**Notice:** Sampling parameter features are currently only available within extension and experimental contexts. While they are useful for exploring model behavior, the current fields are not guaranteed to be supported or interpreted consistently across all models or user agents.
_The limited applicability and non-universal nature of these sampling hyperparameters are discussed further in [issue #42](https://github.com/webmachinelearning/prompt-api/issues/42): sampling hyperparameters are not universal among models._
+In extension and experimental contexts:
+
+* The `LanguageModel.params()` static method provides default and maximum values for the temperature and topK parameters, once the user agent has ascertained or downloaded the specific underlying model.
+
+* The `temperature` and `topK` instance attributes provide the current values of these parameters for a given session.
+
+* Sampling parameters can also be configured at session creation time via the `temperature` and `topK` options for `LanguageModel.create()`.
```js
-// The topK and temperature members of the options object are deprecated. They will only be considered when
-// LanguageModel.create() is called from within a Chrome Extension. In web page contexts, they are ignored.
+// Sampling parameter support is limited to extension and experimental web contexts.
+// Accessors are undefined, and options are ignored, outside of those contexts.
const customSession = await LanguageModel.create({
  temperature: 0.8,
  topK: 10
});
-// This interface and all its attributes (`defaultTopK`, `maxTopK`, `defaultTemperature`, `maxTemperature`)
-// are now only available within Chrome Extension contexts. Web pages can no longer call this method.
```
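Since `LanguageModel.params()` reports maximum values that vary by model, a caller may want to clamp a requested value into the supported range before creating a session. A minimal sketch, assuming the `params()` shape described in this README (the `params` object below uses illustrative values, not real defaults):

```javascript
// Sketch: clamp a requested temperature into the range reported by
// LanguageModel.params() before passing it to LanguageModel.create().
function clampTemperature(requested, { maxTemperature }) {
  return Math.max(0, Math.min(requested, maxTemperature));
}

// Illustrative values, as if returned by `await LanguageModel.params()`:
const params = {
  defaultTemperature: 1.0,
  maxTemperature: 2.0,
  defaultTopK: 3,
  maxTopK: 8
};

console.log(clampTemperature(2.5, params)); // 2
console.log(clampTemperature(0.8, params)); // 0.8
```

A session could then be created with `LanguageModel.create({ temperature: clampTemperature(2.5, params), topK: params.defaultTopK })` in contexts where these options are honored.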