Commit 3bad097

Merge pull request #976 from devoxx/fix/task-200
fix: omit top_p parameter for GPT-5 models
2 parents: fc420e3 + 3834462

4 files changed: 119 additions & 10 deletions

backlog/tasks/task-193 - Fix-packaged-plugin-Skiko-native-runtime-mismatch.md

Lines changed: 7 additions & 7 deletions
@@ -1,11 +1,12 @@
 ---
 id: TASK-193
 title: Fix packaged plugin Skiko native runtime mismatch
-status: In Progress
+status: Done
+priority: medium
 assignee:
 - Codex
 created_date: '2026-03-07 15:22'
-updated_date: '2026-03-07 15:29'
+updated_date: '2026-03-08 11:35'
 labels:
 - build
 - plugin-distribution
@@ -14,21 +15,19 @@ dependencies: []
 references:
 - /Users/stephan/IdeaProjects/DevoxxGenieIDEAPlugin/build.gradle.kts
 - >-
-  /Users/stephan/IdeaProjects/DevoxxGenieIDEAPlugin/build/idea-sandbox/IC-2024.3/plugins/DevoxxGenie/lib
+documentation: []
+ordinal: 1000
 ---
 
-## Description
-
 <!-- SECTION:DESCRIPTION:BEGIN -->
 The built plugin distribution currently packages Skiko AWT classes at version 0.9.37.4 while the platform-specific skiko-awt-runtime native jars are pinned to 0.8.18. On macOS this causes UnsatisfiedLinkError in MetalApiKt when the Compose tool window initializes. The build should package matching Skiko runtime artifacts so the distributed plugin starts cleanly.
 <!-- SECTION:DESCRIPTION:END -->
 
 ## Acceptance Criteria
-<!-- AC:BEGIN -->
+
 - [x] #1 The plugin build configuration no longer mixes Skiko AWT classes and skiko-awt-runtime native jars from different versions.
 - [x] #2 Rebuilding the plugin distribution produces a ZIP whose packaged Skiko runtime jars match the resolved Skiko AWT version.
 - [x] #3 The packaged plugin no longer includes the previously mismatched 0.8.18 Skiko runtime jars alongside skiko-awt 0.9.37.4.
-<!-- AC:END -->
 
 ## Implementation Plan
 
@@ -47,3 +46,4 @@ User retested the rebuilt ZIP in IDEA and still hit the same MetalApi UnsatisfiedLinkError
 
 Rebuilt the plugin after stripping platform-provided Compose/Kotlin runtime jars from `prepareSandbox`. Verified that neither `build/idea-sandbox/IC-2024.3/plugins/DevoxxGenie/lib` nor `build/distributions/DevoxxGenie-1.0.0.zip` contains `skiko-*`, Compose desktop jars, `kotlin-stdlib*`, or `kotlinx-coroutines-core*` anymore. Awaiting runtime retest in IDEA.
 <!-- SECTION:NOTES:END -->
+
Lines changed: 47 additions & 0 deletions (new file)
@@ -0,0 +1,47 @@
+---
+id: TASK-200
+title: OpenAI GPT-5 does not support top_p parameter
+status: Done
+priority: medium
+assignee: []
+created_date: '2026-03-08 11:29'
+updated_date: '2026-03-08 11:35'
+labels:
+- bug
+- openai
+- gpt-5
+dependencies: []
+references:
+- >-
+documentation: []
+ordinal: 1000
+---
+
+<!-- SECTION:DESCRIPTION:BEGIN -->
+When using OpenAI GPT-5, the plugin sends the `top_p` parameter which is not supported by this model, causing a runtime error.
+
+**Error:**
+```
+ExecutionException - Error occurred while processing chat message
+Caused by: CompletionException - com.devoxx.genie.service.prompt.error.ModelException: Provider unavailable:
+{
+  "error": {
+    "message": "Unsupported parameter: 'top_p' is not supported with this model.",
+    "type": "invalid_request_error",
+    "param": "top_p",
+    "code": "unsupported_parameter"
+  }
+}
+```
+
+**Root Cause:** The OpenAI chat model factory sends `top_p` as a parameter when building the chat model request, but GPT-5 does not accept it.
+
+**Suggested Fix:** Conditionally omit `top_p` (and potentially `top_k`) for models that don't support these parameters. This could be handled in the `OpenAiChatModelFactory` by checking the model name or by catching the error and retrying without the unsupported parameter.
+<!-- SECTION:DESCRIPTION:END -->
+
+## Acceptance Criteria
+
+- [ ] #1 GPT-5 model works without top_p error
+- [ ] #2 Other OpenAI models that support top_p continue to work as before
+- [ ] #3 No regression in other cloud providers
+
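The Suggested Fix above names two possible approaches: checking the model name (the approach the commit takes in `OpenAIChatModelFactory` below) or catching the provider error and retrying without the unsupported parameter. For illustration only, here is a minimal sketch of what the catch-and-retry alternative could look like; the class `UnsupportedParameterRetry`, the `Supplier`-based call sites, and the message matching are hypothetical and not part of the plugin.

```java
import java.util.function.Supplier;

// Hypothetical helper: retry a chat request once without top_p when the
// provider rejects it as an unsupported parameter. Illustrative only.
final class UnsupportedParameterRetry {

    static <T> T sendWithFallback(Supplier<T> requestWithTopP, Supplier<T> requestWithoutTopP) {
        try {
            return requestWithTopP.get();
        } catch (RuntimeException e) {
            String message = String.valueOf(e.getMessage());
            // Match the error body shown above ("unsupported_parameter" with param top_p).
            if (message.contains("unsupported_parameter") && message.contains("top_p")) {
                return requestWithoutTopP.get();
            }
            throw e;
        }
    }
}
```

One reason to prefer the model-name check, as the commit does, is that it avoids a failed round trip on every GPT-5 request.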

src/main/java/com/devoxx/genie/chatmodel/cloud/openai/OpenAIChatModelFactory.java

Lines changed: 9 additions & 3 deletions
@@ -47,12 +47,18 @@ public List<LanguageModel> getModels() {
     }
 
     private ChatRequestParameters createChatContextParameters(@NotNull CustomChatModel customChatModel) {
-        boolean isO1 = customChatModel.getModelName().toLowerCase().startsWith("o1");
-        boolean isO3 = customChatModel.getModelName().toLowerCase().startsWith("o3");
+        String modelName = customChatModel.getModelName().toLowerCase();
+        boolean isReasoningModel = modelName.startsWith("o1") || modelName.startsWith("o3");
+        boolean isGpt5 = modelName.startsWith("gpt-5");
 
-        if (isO1 || isO3) {
+        if (isReasoningModel) {
             // o1 and o3 models do not support temperature and topP
             return ChatRequestParameters.builder().build();
+        } else if (isGpt5) {
+            // GPT-5 supports temperature but not topP
+            return ChatRequestParameters.builder()
+                    .temperature(customChatModel.getTemperature())
+                    .build();
         } else {
             return ChatRequestParameters.builder()
                     .temperature(customChatModel.getTemperature())
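The branch added above hard-codes two prefix checks (`o1`/`o3` and `gpt-5`). If more model families with different sampling-parameter support appear, one possible refactor is a small table of prefix rules. This is only a sketch under that assumption, not part of the commit; `ParameterSupport`, `Support`, and `PREFIX_RULES` are invented names.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: map model-name prefixes to the sampling parameters they accept.
final class ParameterSupport {

    enum Support { NONE, TEMPERATURE_ONLY, TEMPERATURE_AND_TOP_P }

    // Order matters: the first matching prefix wins.
    private static final Map<String, Support> PREFIX_RULES = new LinkedHashMap<>();
    static {
        PREFIX_RULES.put("o1", Support.NONE);                 // no temperature, no top_p
        PREFIX_RULES.put("o3", Support.NONE);
        PREFIX_RULES.put("gpt-5", Support.TEMPERATURE_ONLY);  // temperature but no top_p
    }

    static Support forModel(String modelName) {
        String name = modelName.toLowerCase();
        return PREFIX_RULES.entrySet().stream()
                .filter(e -> name.startsWith(e.getKey()))
                .map(Map.Entry::getValue)
                .findFirst()
                .orElse(Support.TEMPERATURE_AND_TOP_P);       // default: send both
    }
}
```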

src/test/java/com/devoxx/genie/chatmodel/cloud/openai/OpenAiChatModelFactoryTest.java

Lines changed: 56 additions & 0 deletions
@@ -9,10 +9,12 @@
 import com.intellij.openapi.application.ApplicationManager;
 import com.intellij.testFramework.ServiceContainerUtil;
 import dev.langchain4j.model.chat.ChatModel;
+import dev.langchain4j.model.chat.request.ChatRequestParameters;
 
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 
+import java.lang.reflect.Method;
 import java.util.List;
 
 import static org.assertj.core.api.Assertions.assertThat;
@@ -67,6 +69,60 @@ void getModels() {
         assertThat(models).size().isGreaterThan(7);
     }
 
+    @Test
+    void createChatModel_gpt5_omitsTopP() throws Exception {
+        OpenAIChatModelFactory factory = new OpenAIChatModelFactory();
+
+        Method method = OpenAIChatModelFactory.class.getDeclaredMethod("createChatContextParameters", CustomChatModel.class);
+        method.setAccessible(true);
+
+        CustomChatModel customChatModel = new CustomChatModel();
+        customChatModel.setModelName("gpt-5");
+        customChatModel.setTemperature(0.7);
+        customChatModel.setTopP(0.9);
+
+        ChatRequestParameters params = (ChatRequestParameters) method.invoke(factory, customChatModel);
+
+        assertThat(params.temperature()).isEqualTo(0.7);
+        assertThat(params.topP()).isNull();
+    }
+
+    @Test
+    void createChatModel_o1_omitsTemperatureAndTopP() throws Exception {
+        OpenAIChatModelFactory factory = new OpenAIChatModelFactory();
+
+        Method method = OpenAIChatModelFactory.class.getDeclaredMethod("createChatContextParameters", CustomChatModel.class);
+        method.setAccessible(true);
+
+        CustomChatModel customChatModel = new CustomChatModel();
+        customChatModel.setModelName("o1");
+        customChatModel.setTemperature(0.7);
+        customChatModel.setTopP(0.9);
+
+        ChatRequestParameters params = (ChatRequestParameters) method.invoke(factory, customChatModel);
+
+        assertThat(params.temperature()).isNull();
+        assertThat(params.topP()).isNull();
+    }
+
+    @Test
+    void createChatModel_gpt4_includesTopP() throws Exception {
+        OpenAIChatModelFactory factory = new OpenAIChatModelFactory();
+
+        Method method = OpenAIChatModelFactory.class.getDeclaredMethod("createChatContextParameters", CustomChatModel.class);
+        method.setAccessible(true);
+
+        CustomChatModel customChatModel = new CustomChatModel();
+        customChatModel.setModelName("gpt-4o");
+        customChatModel.setTemperature(0.7);
+        customChatModel.setTopP(0.9);
+
+        ChatRequestParameters params = (ChatRequestParameters) method.invoke(factory, customChatModel);
+
+        assertThat(params.temperature()).isEqualTo(0.7);
+        assertThat(params.topP()).isEqualTo(0.9);
+    }
+
     private static LanguageModel model(String modelName) {
         return LanguageModel.builder()
                 .provider(ModelProvider.OpenAI)
