Problem
The Smoke Codex workflow fails because the Codex CLI (v1.2.14, model gpt-5.3-codex) now uses the OpenAI Responses API (/responses endpoint) instead of the legacy /v1/chat/completions endpoint. The AWF API proxy sidecar (containers/api-proxy/server.js) does not handle this path, returning a 404 Not Found.
Error from CI logs
ERROR: unexpected status 404 Not Found: Unknown error,
url: http://172.30.0.30:10000/responses,
cf-ray: 9ee6133eda2c6ccf-IAD,
request id: 0b23c169-0f4d-41e6-ae0c-1312e5b144e9
Codex first tries WebSocket connections to ws://172.30.0.30:10000/responses, which fail after exhausting reconnection attempts (5/5, twice); it then falls back to an HTTP POST to the same path and receives the 404.
Affected workflow
Smoke Codex (currently failing on all PRs)
Root cause
The API proxy sidecar (containers/api-proxy/server.js) currently proxies requests to OpenAI on port 10000, but only handles known paths (likely /v1/*). The Codex CLI now sends requests to /responses (the OpenAI Responses API), a path the proxy does not recognize, so it returns 404.
Additionally, Codex attempts WebSocket upgrades to /responses, which the proxy does not support at all.
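For illustration, a minimal stand-in that reproduces both symptoms (hypothetical; the actual path check and proxy logic in server.js are not shown here, so the details below are assumptions based on the description above):

```js
// Hypothetical sketch of the current failure mode, not the real server.js.
const http = require('http');

http.createServer((req, res) => {
  if (req.url.startsWith('/v1/')) {
    // stand-in for the existing whitelisted proxying to api.openai.com
    res.writeHead(501);
    res.end('existing /v1/* proxy logic would run here');
    return;
  }
  // Anything else, including Codex's POST /responses, falls through to 404.
  res.writeHead(404, { 'Content-Type': 'text/plain' });
  res.end('Not Found');
}).listen(10000);

// With no server.on('upgrade', ...) listener, Node closes the connection on
// WebSocket upgrade requests, matching the failed ws://.../responses attempts
// (5/5 reconnections, twice) before Codex falls back to HTTP.
```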
Proposed solution
Update containers/api-proxy/server.js to:
- Proxy /responses HTTP requests to api.openai.com/responses on port 10000 (OpenAI), injecting the API key as with existing paths
- Support WebSocket upgrade for /responses on port 10000, proxying WebSocket connections to wss://api.openai.com/responses with the API key injected in headers
- Consider a catch-all proxy approach for the OpenAI port (10000) that forwards any path to api.openai.com with auth injection, rather than whitelisting specific paths; this would prevent future breakage when OpenAI adds new API endpoints (see the sketch after this list)
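A minimal sketch of the catch-all approach, assuming the sidecar can use the node-http-proxy package and reads the key from an OPENAI_API_KEY environment variable (both assumptions; the real server.js may already have its own forwarding and key-injection helpers that should be reused instead):

```js
// Sketch only: catch-all proxy for the OpenAI port with auth injection and
// WebSocket upgrade support. The package name (http-proxy) and env var
// (OPENAI_API_KEY) are assumptions, not taken from the existing server.js.
const http = require('http');
const httpProxy = require('http-proxy');

const OPENAI_PORT = 10000;

const proxy = httpProxy.createProxyServer({
  target: 'https://api.openai.com',
  changeOrigin: true,
  secure: true,
  ws: true, // also proxy WebSocket upgrades (wss://api.openai.com/responses)
});

// Inject the API key on plain HTTP requests...
proxy.on('proxyReq', (proxyReq) => {
  proxyReq.setHeader('Authorization', `Bearer ${process.env.OPENAI_API_KEY}`);
});

// ...and on WebSocket upgrade requests.
proxy.on('proxyReqWs', (proxyReq) => {
  proxyReq.setHeader('Authorization', `Bearer ${process.env.OPENAI_API_KEY}`);
});

const server = http.createServer((req, res) => {
  // Forward every path instead of whitelisting /v1/*, so /responses (and any
  // future OpenAI endpoint) works without further proxy changes.
  proxy.web(req, res, {}, (err) => {
    res.writeHead(502, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: `upstream proxy error: ${err.message}` }));
  });
});

// Handle the Upgrade requests Codex sends first (ws://.../responses) so the
// responses_websocket transport works instead of falling back to HTTP.
server.on('upgrade', (req, socket, head) => {
  proxy.ws(req, socket, head);
});

server.listen(OPENAI_PORT);
```

If per-path egress restrictions matter for the sidecar, the same structure still works with a small allow-list check in front of proxy.web/proxy.ws, as long as /responses is included.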
Additional context
- Codex CLI logs show auth_header_attached=false and auth_env_openai_api_key_present=false, confirming the proxy is expected to inject credentials
- The WebSocket transport is responses_websocket and the HTTP transport is responses_http
- This is blocking the Smoke Codex workflow on all PRs
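Once the proxy handles /responses, both transports can be checked from inside the AWF network with a quick script (a rough sketch, assuming the ws package is available for ad-hoc testing; the request body is illustrative only):

```js
// Verification sketch: exercise both Codex transports against the proxy.
const http = require('http');
const WebSocket = require('ws'); // assumption: `ws` is available for testing

// responses_websocket transport: the upgrade should now complete.
const socket = new WebSocket('ws://172.30.0.30:10000/responses');
socket.on('open', () => {
  console.log('websocket upgrade to /responses succeeded');
  socket.close();
});
socket.on('error', (err) => console.error('websocket transport failed:', err.message));

// responses_http transport: any status other than 404 shows the path is being
// proxied, even if the upstream rejects this illustrative body.
const req = http.request(
  { host: '172.30.0.30', port: 10000, path: '/responses', method: 'POST',
    headers: { 'Content-Type': 'application/json' } },
  (res) => console.log('http transport status:', res.statusCode),
);
req.end(JSON.stringify({ model: 'gpt-5.3-codex', input: 'ping' }));
```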