diff --git a/.github/prompts/00-base-contract.md b/.github/prompts/00-base-contract.md index 990aefa63..fe5e02ead 100644 --- a/.github/prompts/00-base-contract.md +++ b/.github/prompts/00-base-contract.md @@ -36,71 +36,37 @@ Before producing any analysis or article content, the agent MUST have read: No article sentence may be drafted until every required analysis artifact exists on disk and the gate in `05-analysis-gate.md` reports pass. -## Pipeline (fixed order) +## Two-run pipeline (primary model) +Every run selects one of two modes automatically — see `03-data-download.md §Pre-flight`: + +**Run 1 — Analysis** (when `$ANALYSIS_DIR` is missing or incomplete): ``` -Download → Read methodology → Read templates → Analysis Pass 1 → Analysis Pass 2 → -Analysis Gate → Article (if applicable) → Stage → Commit → ONE create_pull_request +MCP pre-warm → Download → Read methodology → Read templates → +Analysis Pass 1 → Pass 1 snapshot → Analysis Pass 2 → Analysis Gate → +Stage analysis → Commit → ONE create_pull_request (analysis-only) ``` -No step may be skipped, reordered, or executed in parallel with its successor. - -## Phase checkpoint — persist every phase to repo memory - -Valuable analysis must never be lost. After each pipeline phase completes, snapshot its output to the gh-aw repo-memory mount at `$GH_AW_MEMORY_DIR` (runtime default `/tmp/gh-aw/repo-memory/default`). gh-aw pushes that directory to the `memory/news-generation` branch in a **separate post-job** — so checkpoints survive even if the content PR job fails, crashes, or times out. 
- -### Mandatory checkpoint points - -| After phase | Phase label | Source(s) | -|-------------|-------------|-----------| -| 03 Data download | `phase-03-download` | `$ANALYSIS_DIR` (manifest + fetched data summaries) | -| 04 Analysis Pass 1 | `phase-04-pass1` | `$ANALYSIS_DIR` top-level artifacts | -| 04 Analysis Pass 2 | `phase-04-pass2` | `$ANALYSIS_DIR` top-level artifacts | -| 05 Gate pass | `phase-05-gate` | `$ANALYSIS_DIR` top-level artifacts | -| 06 Article generated | `phase-06-article` | `$ANALYSIS_DIR` + today's `news/${ARTICLE_DATE}-*.html` | -| 07 Immediately before `create_pull_request` | `phase-07-final` | `$ANALYSIS_DIR` + articles from `news/${ARTICLE_DATE}-*.html` | -| `news-translate` per batch | `phase-translate-` | Translated `news/${ARTICLE_DATE}-*.html` | - -Each checkpoint is mandatory. Skipping them forfeits the only cross-run safety net for analysis work. - -### Reusable snippet - -Run this bash block at the end of every phase (pass the phase label as `$1`). Article HTML is written directly under the flat `news/` directory, so checkpoint copies must use `news/${ARTICLE_DATE}-*.html` rather than `news/$YYYY/$MM/$DD/*.html`: - -```bash -set -Eeuo pipefail -: "${GH_AW_MEMORY_DIR:=/tmp/gh-aw/repo-memory/default}" -: "${ARTICLE_DATE:?ARTICLE_DATE required for checkpoint}" -: "${SUBFOLDER:?SUBFOLDER required for checkpoint (use batch/ for news-translate)}" -PHASE="${1:?phase label required, e.g. phase-04-pass1}" -ANALYSIS_DIR="${ANALYSIS_DIR:-analysis/daily/$ARTICLE_DATE/$SUBFOLDER}" -DEST="$GH_AW_MEMORY_DIR/$ARTICLE_DATE/$SUBFOLDER/$PHASE" -mkdir -p "$DEST" 2>/dev/null || { echo "[checkpoint] mkdir failed for $DEST — continuing"; exit 0; } -# Snapshot top-level analysis artifacts (never documents/ — often 100+ files — and never pass1/). 
-if [ -d "$ANALYSIS_DIR" ]; then - find "$ANALYSIS_DIR" -maxdepth 1 -type f \( -name '*.md' -o -name '*.json' \) \ - -exec cp -f {} "$DEST"/ \; 2>/dev/null || true -fi -# Snapshot today's produced article HTML from the flat news/ directory (if any exists at this phase). -if [ -d "news" ]; then - find "news" -maxdepth 1 -type f -name "${ARTICLE_DATE}-*.html" \ - -exec cp -f {} "$DEST"/ \; 2>/dev/null || true -fi -COUNT="$(find "$DEST" -maxdepth 1 -type f 2>/dev/null | wc -l | tr -d ' ')" -echo "[checkpoint] $PHASE → $DEST ($COUNT files)" -exit 0 +**Run 2 — Articles** (when `$ANALYSIS_DIR` already contains all 9 core artifacts): +``` +MCP pre-warm → Detect existing analysis → Read all artifacts into context → +Optionally check for new data → Article Pass 1 → Article Pass 2 → +Stage articles → Commit → ONE create_pull_request (articles) ``` -### Checkpoint rules +No step may be skipped within a run. Runs must not overlap for the same `$ARTICLE_DATE` + `$SUBFOLDER`. + +Same-day re-runs always use the same `$ANALYSIS_DIR` folder — never create a parallel folder for the same date + type combination unless `force_generation=true`. + +## Session keepalive requirement + +> ⚠️ **Critical**: The Copilot API creates a server-side session when the agent starts. That session is bound to the `github.token` baked in at step start — it is **never refreshed** mid-run. The session expires at approximately **60 minutes** (gh-aw issue #24920). After expiry, all tool calls and inference requests fail silently. The workflow appears to run but makes zero progress, and **the PR is never created**. + +To mitigate MCP idle-connection drops, workflows set `sandbox.mcp.keepalive-interval: 300` (5-minute ping). This keeps MCP connections alive but does **not** refresh the Copilot API token. 
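For reference, a minimal frontmatter fragment carrying the keepalive setting might look like the following; only the `sandbox.mcp.keepalive-interval: 300` value is taken from this contract, and the surrounding key structure is illustrative:

```yaml
# Illustrative frontmatter fragment. Only sandbox.mcp.keepalive-interval
# is specified by this contract; the enclosing keys are shown for context.
sandbox:
  mcp:
    keepalive-interval: 300   # seconds between MCP keepalive pings (5 min)
```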
+ +**The reliable mitigation is to ensure `safeoutputs___create_pull_request` is called well before the session approaches expiry.** Plan the run so the PR is created before the agent passes ~45 minutes of work — that leaves ~10 minutes of safety margin on the 55-minute `timeout-minutes` cap and ~15 minutes on the ~60-minute token window for staging and safe-outputs publishing. See `07-commit-and-pr.md §Deadline enforcement` for the mandatory PR-timing procedure. -| Rule | Rationale | -|------|-----------| -| **Never block on checkpoint failure** — always `exit 0`. | Repo-memory is a safety net, not a gate. | -| Do **not** copy `$ANALYSIS_DIR/documents/` or `$ANALYSIS_DIR/pass1/`. | `documents/` exceeds the 50-file push cap; `pass1/` is local gate evidence only. | -| Do **not** stage or commit anything under `$GH_AW_MEMORY_DIR`. | gh-aw's `push_repo_memory` post-job publishes it; see `07-commit-and-pr.md`. | -| Prefer small summary `.md` / `.json` files (≤ 50 KB each, ≤ 50 per push). | gh-aw silently drops files exceeding the push caps. | -| Re-run the snippet at every phase, even if earlier phases already snapshotted — it overwrites with the latest content. | Ensures the final state is always preserved, and earlier snapshots remain on the branch from prior runs. | -| For `news-translate`, use `SUBFOLDER=batch/` so memory paths don't collide with analysis runs. | Keeps the branch organised by article type. | +Do not add per-phase checkpoint PRs or repo-memory push steps. ## Output contract diff --git a/.github/prompts/02-mcp-access.md b/.github/prompts/02-mcp-access.md index a41c0e682..fc0772ee7 100644 --- a/.github/prompts/02-mcp-access.md +++ b/.github/prompts/02-mcp-access.md @@ -4,7 +4,7 @@ Authoritative per-workflow surface: the `mcp-servers:` + `tools:` blocks in that ## Servers & tool naming -News workflows declare three data MCP servers + the built-in `github` toolset (via `tools.github.toolsets: [all]`) + `bash` + `agentic-workflows` + `repo-memory`. 
+News workflows declare three data MCP servers + the built-in `github` toolset (via `tools.github.toolsets: [all]`) + `bash` + `agentic-workflows`. | Server | Transport | Declared in | Tool-name style | Example tools | |--------|-----------|-------------|-----------------|---------------| @@ -12,7 +12,6 @@ News workflows declare three data MCP servers + the built-in `github` toolset (v | `scb` | container (`@jarib/pxweb-mcp`) | workflow `mcp-servers:` | `snake_case` | `search_tables`, `get_table_info`, `query_table` | | `world-bank` | container (`worldbank-mcp`) | workflow `mcp-servers:` | `kebab-case` | `get-economic-data`, `get-country-info`, `search-indicators` | | `github` | HTTP (Copilot MCP) | workflow `tools.github` | standard | full GitHub MCP toolset | -| `repo-memory` | local helper | workflow `tools.repo-memory` | standard | persistent cross-run memory on `memory/news-generation` | | `bash` | local helper | workflow `tools.bash` | standard | shell execution | | `safeoutputs` | runner | always available | `snake_case` | `safeoutputs___create_pull_request`, `safeoutputs___noop`, `safeoutputs___dispatch_workflow` | @@ -42,4 +41,4 @@ Run once at workflow start, then proceed — do not loop forever. ## Pre-warm step (CI job, not prompt) -Every news workflow declares a **single** `curl`-based pre-warm step with ≤ 6 retries, ≤ 20 s apart. With `curl --max-time 30`, the worst-case runtime can exceed 4 minutes, so this is a best-effort pre-warm rather than a hard ≤ 2 minute guarantee. If a strict 2 minute cap is required, the workflow's `curl` timeout and/or retry policy must be reduced accordingly. No background pingers. The `safeoutputs` session is kept alive by completing work inside its ~30-minute idle window, not by opening interim PRs. +Every news workflow declares a **single** `curl`-based pre-warm step with ≤ 6 retries, ≤ 20 s apart. 
With `curl --max-time 30`, the worst-case runtime can exceed 4 minutes, so this is a best-effort pre-warm rather than a hard ≤ 2 minute guarantee. If a strict 2 minute cap is required, the workflow's `curl` timeout and/or retry policy must be reduced accordingly. No background pingers. MCP session longevity is maintained via `sandbox.mcp.keepalive-interval: 300`. diff --git a/.github/prompts/03-data-download.md b/.github/prompts/03-data-download.md index 20c157f1d..5b1d06fd5 100644 --- a/.github/prompts/03-data-download.md +++ b/.github/prompts/03-data-download.md @@ -1,5 +1,43 @@ # 03 — Data Download +## Pre-flight: existing analysis check + +Run this check as the **first action** after MCP pre-warm, before any download: + +```bash +ANALYSIS_DIR="analysis/daily/$ARTICLE_DATE/$SUBFOLDER" + +# 9 core artifacts required by every workflow +REQ=(synthesis-summary.md swot-analysis.md risk-assessment.md threat-analysis.md \ + stakeholder-perspectives.md significance-scoring.md classification-results.md \ + cross-reference-map.md data-download-manifest.md) + +# Tier-C workflows require 5 additional artifacts (evening-analysis, week-ahead, +# month-ahead, weekly-review, monthly-review, realtime-*, deep-inspection). +# See ext/tier-c-aggregation.md for the full list. 
+case "$SUBFOLDER" in + evening-analysis|week-ahead|month-ahead|weekly-review|monthly-review|deep-inspection|realtime-*) + REQ+=(README.md executive-brief.md scenario-analysis.md \ + comparative-international.md methodology-reflection.md) + ;; +esac + +SKIP_ANALYSIS=false +ALL_PRESENT=true +for f in "${REQ[@]}"; do + [ -s "$ANALYSIS_DIR/$f" ] || { ALL_PRESENT=false; break; } +done +[ "$ALL_PRESENT" = "true" ] && SKIP_ANALYSIS=true +echo "SKIP_ANALYSIS=$SKIP_ANALYSIS (required artifacts present: $ALL_PRESENT, count: ${#REQ[@]})" +``` + +| `SKIP_ANALYSIS` | Mode | Next step | +|-----------------|------|-----------| +| `false` | **Analysis mode** | Continue with download pipeline below → `04-analysis-pipeline.md` → analysis-only PR (see `07-commit-and-pr.md`). Do **not** generate articles in this run. | +| `true` | **Article mode** | Skip the entire download pipeline and `04-analysis-pipeline.md`. Proceed directly to `06-article-generation.md`. Optionally re-query the API and compare against `data-download-manifest.md`; add only genuinely new `dok_id` entries found since the analysis ran. | + +> **Folder reuse rule**: the same `$ANALYSIS_DIR` is always reused across runs for the same `$ARTICLE_DATE` + `$SUBFOLDER` when `force_generation=false`. The legacy auto-suffix behaviour (`propositions-2`, `propositions-3`, …) is retained **only** as an explicit escape hatch when `force_generation=true`, so that a forced rerun on a merged day can produce a fresh parallel analysis without trampling the existing one. + ## Goal Populate `analysis/daily/$ARTICLE_DATE/$SUBFOLDER/` with raw Riksdag/Regering data and a provenance manifest **before** any analysis starts. 
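A minimal sketch of seeding the provenance manifest before the first fetch follows; the manifest column layout is an assumption for illustration, and the fallback values for `ARTICLE_DATE` / `SUBFOLDER` exist only so the snippet runs standalone:

```shell
# Sketch: ensure $ANALYSIS_DIR exists and seed data-download-manifest.md
# before any fetch runs. ARTICLE_DATE/SUBFOLDER are normally exported by
# the workflow; the defaults and manifest columns here are illustrative.
set -Eeuo pipefail
ARTICLE_DATE="${ARTICLE_DATE:-2025-01-15}"
SUBFOLDER="${SUBFOLDER:-propositions}"
ANALYSIS_DIR="analysis/daily/$ARTICLE_DATE/$SUBFOLDER"

mkdir -p "$ANALYSIS_DIR/documents"
MANIFEST="$ANALYSIS_DIR/data-download-manifest.md"
# Seed the manifest only if it is missing or empty, so re-runs never
# truncate provenance recorded by an earlier download.
if [ ! -s "$MANIFEST" ]; then
  {
    echo "# Data download manifest: $ARTICLE_DATE / $SUBFOLDER"
    echo ""
    echo "| dok_id | source | fetched_at (UTC) |"
    echo "|--------|--------|------------------|"
  } > "$MANIFEST"
fi
echo "[download] manifest ready: $MANIFEST"
```

Each fetch then appends one table row per `dok_id`, giving the analysis gate a complete provenance record to validate against.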
@@ -20,7 +58,7 @@ Populate `analysis/daily/$ARTICLE_DATE/$SUBFOLDER/` with raw Riksdag/Regering da | news-realtime-monitor | `realtime-$HHMM` | | news-article-generator (`deep-inspection`) | `deep-inspection` | -If the base subfolder already contains `synthesis-summary.md` from a prior merged run **and** `force_generation=false`, auto-suffix: `propositions-2`, `propositions-3`, … +If `force_generation=true` is supplied on a day whose base subfolder already contains `synthesis-summary.md` from a prior merged run, auto-suffix the subfolder (`propositions-2`, `propositions-3`, …) so the forced rerun does not overwrite the merged analysis. Under the default `force_generation=false`, the same base subfolder is reused across runs — see §Pre-flight above. ## Download pipeline diff --git a/.github/prompts/04-analysis-pipeline.md b/.github/prompts/04-analysis-pipeline.md index a05332174..f756aae8b 100644 --- a/.github/prompts/04-analysis-pipeline.md +++ b/.github/prompts/04-analysis-pipeline.md @@ -36,6 +36,8 @@ Plus `documents/` subfolder with **one `{dok_id}-analysis.md` file per `dok_id`* ## Execution order +> **Fast-path**: If `SKIP_ANALYSIS=true` (set by `03-data-download.md §Pre-flight`), skip all steps 1–5 below and proceed directly to `06-article-generation.md`. The full analysis already exists on disk from a prior run — do not re-run downloads, Pass 1, Pass 2, or the gate. + 1. **Read all 6 methodologies first** (one tool call per file, do not skip). 2. **Read all 8 templates first.** 3. **Pass 1 — Create** all 9 artifacts + every per-document file. Minimum 15 minutes of real work. diff --git a/.github/prompts/07-commit-and-pr.md b/.github/prompts/07-commit-and-pr.md index 661282471..335c675f8 100644 --- a/.github/prompts/07-commit-and-pr.md +++ b/.github/prompts/07-commit-and-pr.md @@ -10,6 +10,17 @@ Workflows declare `safe-outputs.create-pull-request.max: 1`. Attempting a second call is a workflow error. 
+## Two-run PR strategy + +| Run mode | What to commit | PR title prefix | Labels | After PR | +|----------|---------------|-----------------|--------|----------| +| **Analysis mode** (`SKIP_ANALYSIS=false`) | `analysis/daily/$ARTICLE_DATE/$SUBFOLDER/*.md` + `*.json` (never `pass1/`) | `📊 Analysis — ` | `analysis-only` + article-type | **Stop.** Do NOT generate articles. The next scheduled run will detect the analysis and enter Article mode automatically. | +| **Article mode** (`SKIP_ANALYSIS=true`) | `news/$YYYY/$MM/$DD/$SLUG.{en,sv}.html` + chart JSON | `📰 ` | `agentic-news` + article-type | Dispatch `news-translate` for 12 remaining languages. | + +In **Analysis mode**: commit analysis artifacts, create the `analysis-only` PR, then exit. Zero articles are generated in this run. The analysis stays in the `$ANALYSIS_DIR` folder; the next run of this workflow for the same `$ARTICLE_DATE` will find it and proceed directly to articles. + +In **Article mode**: generate articles from existing analysis, commit, and create the articles PR. + ## Stage → commit → PR 1. **Stage scoped files only.** Never stage the whole repo. @@ -21,8 +32,6 @@ Workflows declare `safe-outputs.create-pull-request.max: 1`. Attempting a second | Articles (core languages) | `news/$YYYY/$MM/$DD/$SLUG.{en,sv}.html` | | Translations (news-translate only) | `news/$YYYY/$MM/$DD/$SLUG..html` | - Repo-memory persistence is handled separately by `tools.repo-memory` and pushed to the `memory/news-generation` branch by the safe-outputs runner job. **Do not** create, stage, or commit any `memory/news-generation/*.json` files in the content PR — there is no `memory/` directory in the working tree of `main`. - Never stage `analysis/daily/$ARTICLE_DATE/$SUBFOLDER/documents/` wholesale — it often contains 100+ files. Stage only `documents/*.md` **if** your `documents/` stays under the safe-outputs 100-file cap; otherwise stage only summary files. 
Never stage `analysis/daily/$ARTICLE_DATE/$SUBFOLDER/pass1/` — it is a local gate-evidence snapshot (see `04-analysis-pipeline.md`), not a deliverable. 2. **100-file guard.** Before calling safeoutputs, count staged files. If the count > 99, unstage everything under `documents/` except `synthesis-summary.md` and re-check. @@ -89,19 +98,22 @@ Call `safeoutputs___noop({"message": ""})` **only** if: In every other case, commit whatever exists and call `create_pull_request` once. -## Final checkpoint — before the PR call +## Deadline enforcement -Immediately before calling `safeoutputs___create_pull_request`, run the **phase checkpoint** from `00-base-contract.md` with label `phase-07-final`. This snapshots the final authoritative analysis + article state to repo memory, so even if the PR call, the safe-outputs runner, or the post-job push fails, the last good state survives on the `memory/news-generation` branch. +> **Root cause**: The Copilot API session is bound to the `github.token` baked in at step start. That token expires at approximately **60 minutes** and is never refreshed mid-run (gh-aw issue #24920). Every tool call and inference request fails silently after that point — the agent appears to run but makes no progress and the PR is never created. Setup steps consume ~5 minutes, so the agent has at most **~55 minutes** of usable session time, and safe-outputs publishing needs several minutes on top. -For `news-translate`, run the checkpoint with label `phase-translate-` after each per-language batch succeeds (before the final PR call), so individual language translations are preserved even if later languages fail. 
+The target PR-creation window depends on which mode the run is in (see `03-data-download.md §Pre-flight`): -## Deadline enforcement +| Mode | Target PR window | Hard deadline | +|------|------------------|---------------| +| Run 1 — Analysis | 40–45 min after agent start | **48 min** | +| Run 2 — Articles | 20–25 min after agent start | **30 min** | -If the run exceeds 40 minutes with no safe-output call yet: +**If the run exceeds its hard deadline with no safe-output call yet:** 1. Stop analysis / article work immediately. -2. Stage whatever exists on disk. -3. Commit. +2. Stage whatever exists on disk (analysis artifacts and/or partial articles). +3. Commit with message including `[early-pr]` to signal partial content. 4. Call `safeoutputs___create_pull_request` with label `analysis-only` if articles are incomplete. -Do not attempt to "save" work via a second PR — there is no second PR. +Do not attempt to "save" work via a second PR — there is no second PR. Creating the PR early is always better than losing all work to a token expiry. The hard deadlines above leave ~7 minutes of margin on the 55-minute `timeout-minutes` cap for staging and safe-outputs publishing before the ~60-minute Copilot API token expiry. 
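The mode-dependent hard-deadline check above can be sketched as a small helper; `AGENT_START_EPOCH` is an assumed convention (an epoch timestamp the agent records at the top of the run), not a variable the runtime exports:

```shell
# Sketch of the hard-deadline check. AGENT_START_EPOCH is assumed to be
# recorded by the agent when it starts; the deadline values mirror the
# table above (48 min for Analysis mode, 30 min for Article mode).
set -Eeuo pipefail
AGENT_START_EPOCH="${AGENT_START_EPOCH:-$(date +%s)}"
MODE="${1:-analysis}"   # "analysis" or "articles"

case "$MODE" in
  analysis) DEADLINE_MIN=48 ;;
  articles) DEADLINE_MIN=30 ;;
  *) echo "[deadline] unknown mode: $MODE" >&2; exit 2 ;;
esac

# Minutes of usable session time consumed so far.
ELAPSED_MIN=$(( ( $(date +%s) - AGENT_START_EPOCH ) / 60 ))

if [ "$ELAPSED_MIN" -ge "$DEADLINE_MIN" ]; then
  echo "[deadline] EXCEEDED: ${ELAPSED_MIN}m elapsed; stop work, stage, and create the PR now"
else
  echo "[deadline] OK: ${ELAPSED_MIN}m of ${DEADLINE_MIN}m used"
fi
```

Running this between phases (rather than once) keeps the agent aware of how much of the token window remains before the PR call must be made.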
diff --git a/.github/workflows/news-article-generator.lock.yml b/.github/workflows/news-article-generator.lock.yml index 3a44cb32e..4bf89a2fe 100644 --- a/.github/workflows/news-article-generator.lock.yml +++ b/.github/workflows/news-article-generator.lock.yml @@ -1,5 +1,5 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"4390716293e06a2a234a6472a5cc3a514d2dab3fb7f9e819765e27068ea9148e","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} -# gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"mcr.microsoft.com/playwright/mcp"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"2088ca769c92d4e75058035e97ee389d868dba1234d61fb631604fc7b7d88484","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-manifest: 
{"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) # | |_| | __ _ ___ _ __ | |_ _ ___ @@ -58,7 +58,6 @@ # - ghcr.io/github/gh-aw-firewall/squid:0.25.26 # - ghcr.io/github/gh-aw-mcpg:v0.2.26 # - ghcr.io/github/github-mcp-server:v1.0.0 -# - mcr.microsoft.com/playwright/mcp # - node:25-alpine # - node:lts-alpine @@ -211,27 +210,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_0594bf2f1e4261d5_EOF' + cat << 'GH_AW_PROMPT_66520513f561422b_EOF' - GH_AW_PROMPT_0594bf2f1e4261d5_EOF + GH_AW_PROMPT_66520513f561422b_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat 
"${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/playwright_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_0594bf2f1e4261d5_EOF' + cat << 'GH_AW_PROMPT_66520513f561422b_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_0594bf2f1e4261d5_EOF + GH_AW_PROMPT_66520513f561422b_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_0594bf2f1e4261d5_EOF' + cat << 'GH_AW_PROMPT_66520513f561422b_EOF' The following GitHub context information is available for this workflow: @@ -261,9 +257,9 @@ jobs: {{/if}} - GH_AW_PROMPT_0594bf2f1e4261d5_EOF + GH_AW_PROMPT_66520513f561422b_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_0594bf2f1e4261d5_EOF' + cat << 'GH_AW_PROMPT_66520513f561422b_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -275,7 +271,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-article-generator.md}} - GH_AW_PROMPT_0594bf2f1e4261d5_EOF + GH_AW_PROMPT_66520513f561422b_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -299,12 +295,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 
files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -323,13 +313,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -422,16 +406,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n 
\"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -476,7 +450,7 @@ jobs: const determineAutomaticLockdown = require('${{ runner.temp }}/gh-aw/actions/determine_automatic_lockdown.cjs'); await determineAutomaticLockdown(github, context, core); - name: Download container images - run: bash "${RUNNER_TEMP}/gh-aw/actions/download_docker_images.sh" alpine:latest 
ghcr.io/github/gh-aw-firewall/agent:0.25.26 ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26 ghcr.io/github/gh-aw-firewall/squid:0.25.26 ghcr.io/github/gh-aw-mcpg:v0.2.26 ghcr.io/github/github-mcp-server:v1.0.0 mcr.microsoft.com/playwright/mcp node:25-alpine node:lts-alpine + run: bash "${RUNNER_TEMP}/gh-aw/actions/download_docker_images.sh" alpine:latest ghcr.io/github/gh-aw-firewall/agent:0.25.26 ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26 ghcr.io/github/gh-aw-firewall/squid:0.25.26 ghcr.io/github/gh-aw-mcpg:v0.2.26 ghcr.io/github/github-mcp-server:v1.0.0 node:25-alpine node:lts-alpine - name: Install gh-aw extension env: GH_TOKEN: ${{ secrets.GH_AW_GITHUB_MCP_SERVER_TOKEN || secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} @@ -506,9 +480,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_74b57913640f4749_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memo
ry":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_74b57913640f4749_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_8ec9076300ff1718_EOF' + {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_8ec9076300ff1718_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -754,8 +728,6 @@ jobs: run: | set -eo pipefail mkdir -p "${RUNNER_TEMP}/gh-aw/mcp-config" - mkdir -p /tmp/gh-aw/mcp-logs/playwright - chmod 777 /tmp/gh-aw/mcp-logs/playwright # Export gateway environment variables for MCP config and gateway script export MCP_GATEWAY_PORT="8080" @@ -776,7 +748,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_43773805ac441abf_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << 
GH_AW_MCP_CONFIG_c32305902593caa9_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -816,20 +788,6 @@ jobs: } } }, - "playwright": { - "type": "stdio", - "container": "mcr.microsoft.com/playwright/mcp", - "args": ["--init", "--network", "host", "--security-opt", "seccomp=unconfined", "--ipc=host"], - "entrypointArgs": ["--output-dir", "/tmp/gh-aw/mcp-logs/playwright", "--no-sandbox"], - "mounts": ["/tmp/gh-aw/mcp-logs:/tmp/gh-aw/mcp-logs:rw"], - "guard-policies": { - "write-sink": { - "accept": [ - "*" - ] - } - } - }, "riksdag-regering": { "type": "http", "url": "https://riksdag-regering-ai.onrender.com/mcp", @@ -903,10 +861,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_43773805ac441abf_EOF + GH_AW_MCP_CONFIG_c32305902593caa9_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -924,7 +883,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -932,7 +891,7 @@ jobs: export GH_AW_NODE_BIN (umask 177 && touch /tmp/gh-aw/agent-stdio.log) # shellcheck disable=SC1003 - sudo -E awf --container-workdir "${GITHUB_WORKSPACE}" --mount "${RUNNER_TEMP}/gh-aw:${RUNNER_TEMP}/gh-aw:ro" --mount "${RUNNER_TEMP}/gh-aw:/host${RUNNER_TEMP}/gh-aw:ro" --env-all --exclude-env COPILOT_GITHUB_TOKEN --exclude-env GITHUB_MCP_SERVER_TOKEN --exclude-env MCP_GATEWAY_API_KEY --allow-domains 
'*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,cdn.playwright.dev,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,playwright.download.prss.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonitor.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,www.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com' --log-level info --proxy-logs-dir /tmp/gh-aw/sandbox/firewall/logs --audit-dir /tmp/gh-aw/sandbox/firewall/audit --enable-host-access --allow-host-ports 80,443,8080 --image-tag 0.25.26 --skip-pull 
--enable-api-proxy \ + sudo -E awf --container-workdir "${GITHUB_WORKSPACE}" --mount "${RUNNER_TEMP}/gh-aw:${RUNNER_TEMP}/gh-aw:ro" --mount "${RUNNER_TEMP}/gh-aw:/host${RUNNER_TEMP}/gh-aw:ro" --env-all --exclude-env COPILOT_GITHUB_TOKEN --exclude-env GITHUB_MCP_SERVER_TOKEN --exclude-env MCP_GATEWAY_API_KEY --allow-domains '*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonitor.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,ww
w.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com' --log-level info --proxy-logs-dir /tmp/gh-aw/sandbox/firewall/logs --audit-dir /tmp/gh-aw/sandbox/firewall/audit --enable-host-access --allow-host-ports 80,443,8080 --image-tag 0.25.26 --skip-pull --enable-api-proxy \ -- /bin/bash -c 'GH_AW_NODE_EXEC="${GH_AW_NODE_BIN:-}"; if [ -z "$GH_AW_NODE_EXEC" ] || [ ! -x "$GH_AW_NODE_EXEC" ]; then GH_AW_NODE_EXEC="$(command -v node 2>/dev/null || echo node)"; fi; "$GH_AW_NODE_EXEC" ${RUNNER_TEMP}/gh-aw/actions/copilot_driver.cjs /usr/local/bin/copilot --add-dir /tmp/gh-aw/ --log-level all --log-dir /tmp/gh-aw/sandbox/agent/logs/ --disable-builtin-mcps --no-ask-user --allow-all-tools --allow-all-paths --add-dir "${GITHUB_WORKSPACE}" --prompt-file /tmp/gh-aw/aw-prompts/prompt.txt' 2>&1 | tee -a /tmp/gh-aw/agent-stdio.log env: COPILOT_AGENT_RUNNER_TYPE: STANDALONE @@ -1021,7 +980,7 @@ jobs: uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 env: GH_AW_SAFE_OUTPUTS: ${{ steps.set-runtime-paths.outputs.GH_AW_SAFE_OUTPUTS }} - GH_AW_ALLOWED_DOMAINS: 
"*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,cdn.playwright.dev,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,localhost,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,playwright.download.prss.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonitor.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,www.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com" + GH_AW_ALLOWED_DOMAINS: 
"*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,localhost,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonitor.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,www.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com" GITHUB_SERVER_URL: ${{ github.server_url }} GITHUB_API_URL: ${{ github.api_url }} with: @@ -1082,15 +1041,6 @@ jobs: if [ ! 
-f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1119,7 +1069,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1244,13 +1193,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1419,79 +1364,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - 
contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . - - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - 
ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation @@ -1590,7 +1462,7 @@ jobs: uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 env: GH_AW_AGENT_OUTPUT: ${{ steps.setup-agent-output-env.outputs.GH_AW_AGENT_OUTPUT }} - GH_AW_ALLOWED_DOMAINS: "*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,cdn.playwright.dev,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,localhost,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,playwright.download.prss
.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonitor.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,www.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com" + GH_AW_ALLOWED_DOMAINS: "*.githubusercontent.com,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.imf.org,api.individual.githubcopilot.com,api.npms.io,api.scb.se,api.snapcraft.io,api.worldbank.org,archive.ubuntu.com,azure.archive.ubuntu.com,bun.sh,cdn.jsdelivr.net,codeload.github.com,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,data.imf.org,data.riksdagen.se,deb.nodesource.com,deno.land,docs.github.com,esm.sh,get.pnpm.io,github-cloud.githubusercontent.com,github-cloud.s3.amazonaws.com,github.blog,github.com,github.githubassets.com,googleapis.deno.dev,googlechromelabs.github.io,hack23.com,hack23.github.io,host.docker.internal,json-schema.org,json.schemastore.org,jsr.io,keyserver.ubuntu.com,lfs.github.com,localhost,nodejs.org,npm.pkg.github.com,npmjs.com,npmjs.org,objects.githubusercontent.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,ppa.launchpad.net,raw.githubusercontent.com,regeringen.se,registry.bower.io,registry.npmjs.com,registry.npmjs.org,registry.yarnpkg.com,repo.yarnpkg.com,riksdag-regering-ai.onrender.com,riksdagen.se,riksdagsmonito
r.com,s.symcb.com,s.symcd.com,security.ubuntu.com,skimdb.npmjs.com,storage.googleapis.com,telemetry.enterprise.githubcopilot.com,telemetry.vercel.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com,www.googleapis.com,www.hack23.com,www.imf.org,www.npmjs.com,www.npmjs.org,www.regeringen.se,www.riksdagen.se,www.riksdagsmonitor.com,www.scb.se,yarnpkg.com" GITHUB_SERVER_URL: ${{ github.server_url }} GITHUB_API_URL: ${{ github.api_url }} GH_AW_SAFE_OUTPUTS_HANDLER_CONFIG: "{\"add_comment\":{\"max\":1},\"create_pull_request\":{\"draft\":false,\"expires\":336,\"labels\":[\"agentic-news\",\"analysis-data\"],\"max\":1,\"max_patch_size\":4096,\"protected_files\":[\"package.json\",\"bun.lockb\",\"bunfig.toml\",\"deno.json\",\"deno.jsonc\",\"deno.lock\",\"global.json\",\"NuGet.Config\",\"Directory.Packages.props\",\"mix.exs\",\"mix.lock\",\"go.mod\",\"go.sum\",\"stack.yaml\",\"stack.yaml.lock\",\"pom.xml\",\"build.gradle\",\"build.gradle.kts\",\"settings.gradle\",\"settings.gradle.kts\",\"gradle.properties\",\"package-lock.json\",\"yarn.lock\",\"pnpm-lock.yaml\",\"npm-shrinkwrap.json\",\"requirements.txt\",\"Pipfile\",\"Pipfile.lock\",\"pyproject.toml\",\"setup.py\",\"setup.cfg\",\"Gemfile\",\"Gemfile.lock\",\"uv.lock\",\"CODEOWNERS\",\"AGENTS.md\",\"CLAUDE.md\",\"GEMINI.md\"],\"protected_path_prefixes\":[\".github/\",\".agents/\"]},\"create_report_incomplete_issue\":{},\"dispatch_workflow\":{\"max\":1,\"workflow_files\":{\"news-translate\":\".lock.yml\"},\"workflows\":[\"news-translate\"]},\"missing_data\":{},\"missing_tool\":{},\"noop\":{\"max\":1,\"report-as-issue\":\"true\"},\"report_incomplete\":{}}" diff --git a/.github/workflows/news-article-generator.md b/.github/workflows/news-article-generator.md index 7475f309a..867874cb1 100644 --- a/.github/workflows/news-article-generator.md +++ b/.github/workflows/news-article-generator.md @@ -49,12 +49,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 +timeout-minutes: 55 concurrency: group: 
gh-aw-news-article-generator-${{ inputs.article_types || 'manual' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -106,13 +115,6 @@ tools: - all agentic-workflows: true bash: true - playwright: - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -242,38 +244,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a - **Article type**: `multi` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE//` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts per type); Run 2 detects existing analysis and produces an articles PR. + +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~45 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 
5 Tier-C) | 04 / ext | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–41 | Analysis Gate (Tier-C extended gate) | 05 | +| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 14 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `multi` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE//` already contains all 9 core artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Article Generator (Manual) — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
diff --git a/.github/workflows/news-committee-reports.lock.yml b/.github/workflows/news-committee-reports.lock.yml index b05b76ba5..2e6200edc 100644 --- a/.github/workflows/news-committee-reports.lock.yml +++ b/.github/workflows/news-committee-reports.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"85260a8672a99d9dc6656fd713ac8b302d0edb3fecc12c978b6e83fe9e42ff0b","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"fc12458d8971651c1ab84fe996bfec966a37ad3b1acfdf4bffebeb5d548511a0","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} 
GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_415fc1bfa15f3ea6_EOF' + cat << 'GH_AW_PROMPT_262ed51ea3835769_EOF' - GH_AW_PROMPT_415fc1bfa15f3ea6_EOF + GH_AW_PROMPT_262ed51ea3835769_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_415fc1bfa15f3ea6_EOF' + cat << 'GH_AW_PROMPT_262ed51ea3835769_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_415fc1bfa15f3ea6_EOF + GH_AW_PROMPT_262ed51ea3835769_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_415fc1bfa15f3ea6_EOF' + cat << 'GH_AW_PROMPT_262ed51ea3835769_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_415fc1bfa15f3ea6_EOF + GH_AW_PROMPT_262ed51ea3835769_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_415fc1bfa15f3ea6_EOF' + cat << 'GH_AW_PROMPT_262ed51ea3835769_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -265,7 +263,7 @@ jobs: {{#runtime-import .github/prompts/06-article-generation.md}} {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/workflows/news-committee-reports.md}} - GH_AW_PROMPT_415fc1bfa15f3ea6_EOF + GH_AW_PROMPT_262ed51ea3835769_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: 
actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -289,12 +287,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -313,13 +305,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -414,16 +400,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; 
then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash 
"${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -498,9 +474,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_95936ae540a1c48f_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_95936ae540a1c48f_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_9244eaa699934fcb_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_9244eaa699934fcb_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -766,7 +742,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_a88ce2a3bc0403de_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_ef29cce582f7dd34_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -879,10 +855,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_a88ce2a3bc0403de_EOF + GH_AW_MCP_CONFIG_ef29cce582f7dd34_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -900,7 
+877,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1058,15 +1035,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1095,7 +1063,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1220,13 +1187,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1395,79 +1358,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-committee-reports.md b/.github/workflows/news-committee-reports.md index e094c3b12..1f9d1b7fa 100644 --- a/.github/workflows/news-committee-reports.md +++ b/.github/workflows/news-committee-reports.md @@ -40,12 +40,21 @@ permissions: discussions: read security-events: read 
-timeout-minutes: 60 +timeout-minutes: 55 concurrency: group: gh-aw-news-committee-reports-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -97,12 +106,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -228,38 +231,47 @@ Generates deep political intelligence articles on parliamentary committee report - **Article type**: `committee-reports` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/committeeReports/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR; Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. 
+ +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~43 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–40 | Analysis Gate | 05 | +| 40–43 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 9 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `committee-reports` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/committeeReports/` already contains all 9 required artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Committee Reports — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Gate, commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
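The run-mode selection described above can be sketched as a small bash helper. This is an illustrative sketch only, not the workflow's implementation: the function name, the top-level-artifact counting rule, and the `FORCE_GENERATION` handling are assumptions here; the authoritative pre-flight logic lives in `03-data-download.md`.

```shell
#!/usr/bin/env bash
set -u

# Hypothetical sketch of the pre-flight run-mode check. The artifact
# threshold (9 for committee-reports) comes from this workflow's spec;
# everything else is assumed for illustration.
select_run_mode() {
  local analysis_dir="$1" force="${2:-false}" required="${3:-9}" count=0
  if [ -d "$analysis_dir" ]; then
    # Count only top-level .md/.json artifacts (never documents/ or pass1/).
    count=$(find "$analysis_dir" -maxdepth 1 -type f \
      \( -name '*.md' -o -name '*.json' \) | wc -l | tr -d ' ')
  fi
  if [ "$force" = "true" ] || [ "$count" -lt "$required" ]; then
    echo "analysis"   # Run 1: download, Pass 1 + Pass 2, Gate, analysis-only PR
  else
    echo "article"    # Run 2: read existing analysis, generate articles PR
  fi
}
```

A caller would pick the mode with `MODE=$(select_run_mode "analysis/daily/$ARTICLE_DATE/committeeReports" "${FORCE_GENERATION:-false}")` and then follow either the analysis-mode or article-mode time budget.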
diff --git a/.github/workflows/news-evening-analysis.lock.yml b/.github/workflows/news-evening-analysis.lock.yml index 064e0452f..d5e059302 100644 --- a/.github/workflows/news-evening-analysis.lock.yml +++ b/.github/workflows/news-evening-analysis.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"ba1723d47e9431f150544397019675615b429c75e50a650fd099bd8d64e86959","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"60e47e127f50ad5c8dad4b5ebb45160caf522ae3e54063abb1761fb9ec01c41d","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"mcr.microsoft.com/playwright/mcp"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -208,27 +208,25 @@ jobs: GH_AW_GITHUB_REPOSITORY: 
${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_21fd943019943a15_EOF' + cat << 'GH_AW_PROMPT_b5a8ea6ec5398c89_EOF' - GH_AW_PROMPT_21fd943019943a15_EOF + GH_AW_PROMPT_b5a8ea6ec5398c89_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/playwright_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_21fd943019943a15_EOF' + cat << 'GH_AW_PROMPT_b5a8ea6ec5398c89_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_21fd943019943a15_EOF + GH_AW_PROMPT_b5a8ea6ec5398c89_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_21fd943019943a15_EOF' + cat << 'GH_AW_PROMPT_b5a8ea6ec5398c89_EOF' The following GitHub context information is available for this workflow: @@ -258,9 +256,9 @@ jobs: {{/if}} - GH_AW_PROMPT_21fd943019943a15_EOF + GH_AW_PROMPT_b5a8ea6ec5398c89_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_21fd943019943a15_EOF' + cat << 'GH_AW_PROMPT_b5a8ea6ec5398c89_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -272,7 +270,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-evening-analysis.md}} - GH_AW_PROMPT_21fd943019943a15_EOF + GH_AW_PROMPT_b5a8ea6ec5398c89_EOF } > "$GH_AW_PROMPT" - name: Interpolate 
variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -296,12 +294,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -320,13 +312,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -421,16 +407,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if 
nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash 
"${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -505,9 +481,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_9919f8b0904ec91c_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_9919f8b0904ec91c_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_316d30d36f31babd_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_316d30d36f31babd_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -775,7 +751,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_858077292f059c87_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_91c3125f0b9f24b6_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -902,10 +878,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_858077292f059c87_EOF + GH_AW_MCP_CONFIG_91c3125f0b9f24b6_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -923,7 
+900,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1081,15 +1058,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1118,7 +1086,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1243,13 +1210,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1418,79 +1381,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-evening-analysis.md b/.github/workflows/news-evening-analysis.md index 23d395dce..4e7e88aa1 100644 --- a/.github/workflows/news-evening-analysis.md +++ b/.github/workflows/news-evening-analysis.md @@ -48,12 +48,21 @@ permissions: discussions: read security-events: read 
-timeout-minutes: 60 +timeout-minutes: 55 concurrency: group: gh-aw-news-evening-analysis-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -106,12 +115,6 @@ tools: agentic-workflows: true bash: true playwright: - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -241,38 +244,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a - **Article type**: `evening-analysis` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/evening-analysis/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. + +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~45 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 
5 Tier-C) | 04 / ext | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–41 | Analysis Gate (Tier-C extended gate) | 05 | +| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 14 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs

- `article_date` — override date (defaults to today)
-- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless)
+- `force_generation` — regenerate even if today's article exists; also forces analysis re-run
- `languages` — core content languages (default `en,sv`)
- `analysis_depth` — `standard` | `deep` (default) | `comprehensive`

-## Dedup & analysis-only path
+## Run-mode selection

-If articles for `$ARTICLE_DATE` + `evening-analysis` already exist **and** `force_generation=false`:
+At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/evening-analysis/` already contains all 14 required artifacts (9 core + 5 Tier-C):

-- Still run the full analysis pipeline (modules 03 → 04 → 05).
-- Commit the analysis.
-- Open the single PR with title `📊 Analysis Only — Evening Analysis — $ARTICLE_DATE` and label `analysis-only`.
+- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop.
+- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`.

-Analysis is the primary product — a run never "does nothing" just because articles exist.
+Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing.

All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules.
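As a reviewer's illustration, the run-mode selection described in this file could be sketched as a small pre-flight script. This is a hypothetical sketch only: the authoritative check lives in `03-data-download.md`, and the artifact layout (top-level `.md`/`.json` files) and the count threshold are assumptions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the run-mode pre-flight; not the authoritative
# implementation (that lives in 03-data-download.md). The artifact layout
# and the default threshold below are assumptions.
set -Eeuo pipefail
ARTICLE_DATE="${ARTICLE_DATE:-$(date -u +%F)}"
ANALYSIS_DIR="analysis/daily/${ARTICLE_DATE}/evening-analysis"
# Placeholder default: 9 core artifacts. Raise this to the full count if
# this workflow's gate also requires the Tier-C artifacts.
REQUIRED_ARTIFACTS="${REQUIRED_ARTIFACTS:-9}"

count_artifacts() {
  # Count top-level artifacts; prints 0 when the folder does not exist.
  find "$1" -maxdepth 1 -type f \( -name '*.md' -o -name '*.json' \) 2>/dev/null | wc -l
}

if [ "${FORCE_GENERATION:-false}" = "true" ]; then
  MODE="analysis"   # force_generation always re-runs the analysis
elif [ "$(count_artifacts "$ANALYSIS_DIR")" -ge "$REQUIRED_ARTIFACTS" ]; then
  MODE="article"    # complete analysis already on disk, generate articles
else
  MODE="analysis"   # missing or incomplete analysis, (re)build it
fi
echo "[pre-flight] run mode: $MODE"
```

In a fresh workspace with no prior analysis on disk this selects Analysis mode, matching the Run 1 branch described above.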
diff --git a/.github/workflows/news-interpellations.lock.yml b/.github/workflows/news-interpellations.lock.yml index aa263fbb0..4c6f0fe75 100644 --- a/.github/workflows/news-interpellations.lock.yml +++ b/.github/workflows/news-interpellations.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"d1b5eeff6d85f73e5e2f3a27c545b9db288de3dd45c2ff7fb2b07a66cf60bc33","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"b551dad14c26f792101f1b6152f5ab328bc5a0e0c77673525d68497f8020e14b","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ 
github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_56ceca6b6602419b_EOF' + cat << 'GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF' - GH_AW_PROMPT_56ceca6b6602419b_EOF + GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_56ceca6b6602419b_EOF' + cat << 'GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_56ceca6b6602419b_EOF + GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_56ceca6b6602419b_EOF' + cat << 'GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_56ceca6b6602419b_EOF + GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_56ceca6b6602419b_EOF' + cat << 'GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -265,7 +263,7 @@ jobs: {{#runtime-import .github/prompts/06-article-generation.md}} {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/workflows/news-interpellations.md}} - GH_AW_PROMPT_56ceca6b6602419b_EOF + GH_AW_PROMPT_9d0a6b1fe1bb8640_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ 
-289,12 +287,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -313,13 +305,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -414,16 +400,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep 
\"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ 
github.repository }} @@ -498,9 +474,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_591ccf46f42b73cb_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_591ccf46f42b73cb_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_722e52955bc3598d_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_722e52955bc3598d_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -766,7 +742,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_58fe3c2ad85e2bbc_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_ccc2c87bdeac2dc8_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -879,10 +855,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_58fe3c2ad85e2bbc_EOF + GH_AW_MCP_CONFIG_ccc2c87bdeac2dc8_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -900,7 
+877,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1058,15 +1035,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1095,7 +1063,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1220,13 +1187,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1395,79 +1358,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-interpellations.md b/.github/workflows/news-interpellations.md index b5799d07c..f3dc8ae70 100644 --- a/.github/workflows/news-interpellations.md +++ b/.github/workflows/news-interpellations.md @@ -40,12 +40,21 @@ permissions: discussions: read security-events: read 
-timeout-minutes: 60 +timeout-minutes: 55 concurrency: group: gh-aw-news-interpellations-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -97,12 +106,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -228,38 +231,47 @@ Generates deep political intelligence articles on interpellation debates, includ - **Article type**: `interpellations` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/interpellations/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR; Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. 
+ +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~43 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–40 | Analysis Gate | 05 | +| 40–43 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 9 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `interpellations` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/interpellations/` already contains all 9 required artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Interpellation Debates — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Gate, commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
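A name-based variant of the same pre-flight check can also be sketched. The file names below are placeholders, since the real 9-artifact list is defined by the analysis methodology modules, not by this workflow file:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: pick the run mode by checking for named artifacts.
# The file names below are placeholders, not the real artifact list.
set -Eeuo pipefail
ARTICLE_DATE="${ARTICLE_DATE:-$(date -u +%F)}"
ANALYSIS_DIR="analysis/daily/${ARTICLE_DATE}/interpellations"

REQUIRED=("manifest.json" "themes.md" "speakers.md")  # placeholders
MODE="article"
for artifact in "${REQUIRED[@]}"; do
  if [ ! -f "$ANALYSIS_DIR/$artifact" ]; then
    MODE="analysis"   # any missing artifact forces a fresh analysis run
    break
  fi
done
echo "[pre-flight] run mode: $MODE"
```

Checking named files rather than a count catches the case where the folder holds the right number of artifacts but a specific required one is missing.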
diff --git a/.github/workflows/news-month-ahead.lock.yml b/.github/workflows/news-month-ahead.lock.yml index 4485959a6..aae004621 100644 --- a/.github/workflows/news-month-ahead.lock.yml +++ b/.github/workflows/news-month-ahead.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"edc0474901ac6a1bac6847c0bd8635b6adbacbd4709e2109d31070a72a53068e","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"e433ab1675b75f490de7b45e7fcf22f9c4d462fb63d6fd5bfb3b39431b3f0843","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} 
GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_d68ebdb73563ac48_EOF' + cat << 'GH_AW_PROMPT_09d424fe223a26d4_EOF' - GH_AW_PROMPT_d68ebdb73563ac48_EOF + GH_AW_PROMPT_09d424fe223a26d4_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_d68ebdb73563ac48_EOF' + cat << 'GH_AW_PROMPT_09d424fe223a26d4_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_d68ebdb73563ac48_EOF + GH_AW_PROMPT_09d424fe223a26d4_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_d68ebdb73563ac48_EOF' + cat << 'GH_AW_PROMPT_09d424fe223a26d4_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_d68ebdb73563ac48_EOF + GH_AW_PROMPT_09d424fe223a26d4_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_d68ebdb73563ac48_EOF' + cat << 'GH_AW_PROMPT_09d424fe223a26d4_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -266,7 +264,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-month-ahead.md}} - GH_AW_PROMPT_d68ebdb73563ac48_EOF + GH_AW_PROMPT_09d424fe223a26d4_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -290,12 +288,6 @@ 
jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -314,13 +306,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -415,16 +401,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head 
-1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -499,9 
+475,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_ff7f4402944ea0d2_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_ff7f4402944ea0d2_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_6340262f577529e8_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_6340262f577529e8_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -767,7 +743,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_d1f05584a39f813d_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_dbb80b0313d0fd27_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -880,10 +856,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_d1f05584a39f813d_EOF + GH_AW_MCP_CONFIG_dbb80b0313d0fd27_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -901,7 
+878,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1059,15 +1036,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1096,7 +1064,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1221,13 +1188,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1396,79 +1359,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-month-ahead.md b/.github/workflows/news-month-ahead.md index 10a3b9779..317cc53a9 100644 --- a/.github/workflows/news-month-ahead.md +++ b/.github/workflows/news-month-ahead.md @@ -42,12 +42,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 
+timeout-minutes: 55 concurrency: group: gh-aw-news-month-ahead-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -99,12 +108,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -234,38 +237,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a - **Article type**: `month-ahead` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/month-ahead/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. + +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~45 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 
5 Tier-C) | 04 / ext | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–41 | Analysis Gate (Tier-C extended gate) | 05 | +| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 14 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `month-ahead` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/month-ahead/` already contains all 9 core artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Month Ahead — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
diff --git a/.github/workflows/news-monthly-review.lock.yml b/.github/workflows/news-monthly-review.lock.yml index f0f915e64..72b558ace 100644 --- a/.github/workflows/news-monthly-review.lock.yml +++ b/.github/workflows/news-monthly-review.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"ee594adff70b242159d9488e1d78e721087832bc47d821be24c08a81ac3e3c9f","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"e16207ca6e49087974fe7d85464b9b122e6a1b0aa72db7291db8bc40a661f36e","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ 
github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_a84a4a7eb3e6b141_EOF' + cat << 'GH_AW_PROMPT_2bca7b7da5338499_EOF' - GH_AW_PROMPT_a84a4a7eb3e6b141_EOF + GH_AW_PROMPT_2bca7b7da5338499_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_a84a4a7eb3e6b141_EOF' + cat << 'GH_AW_PROMPT_2bca7b7da5338499_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_a84a4a7eb3e6b141_EOF + GH_AW_PROMPT_2bca7b7da5338499_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_a84a4a7eb3e6b141_EOF' + cat << 'GH_AW_PROMPT_2bca7b7da5338499_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_a84a4a7eb3e6b141_EOF + GH_AW_PROMPT_2bca7b7da5338499_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_a84a4a7eb3e6b141_EOF' + cat << 'GH_AW_PROMPT_2bca7b7da5338499_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -266,7 +264,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-monthly-review.md}} - GH_AW_PROMPT_a84a4a7eb3e6b141_EOF + GH_AW_PROMPT_2bca7b7da5338499_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ 
-290,12 +288,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -314,13 +306,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -415,16 +401,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep 
\"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ 
github.repository }} @@ -499,9 +475,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_c9e7590b58d44a81_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_c9e7590b58d44a81_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_22ea4c03b0dc6ab3_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_22ea4c03b0dc6ab3_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -767,7 +743,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_38eec37a1280f704_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_6e0d8eee1b62eb66_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -880,10 +856,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_38eec37a1280f704_EOF + GH_AW_MCP_CONFIG_6e0d8eee1b62eb66_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -901,7 
+878,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1059,15 +1036,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1096,7 +1064,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1221,13 +1188,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1396,79 +1359,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-monthly-review.md b/.github/workflows/news-monthly-review.md index 6f9d89341..5e6210112 100644 --- a/.github/workflows/news-monthly-review.md +++ b/.github/workflows/news-monthly-review.md @@ -42,12 +42,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 
60 +timeout-minutes: 55 concurrency: group: gh-aw-news-monthly-review-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -99,12 +108,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -234,38 +237,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a - **Article type**: `monthly-review` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/monthly-review/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. + +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~45 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 
5 Tier-C) | 04 / ext | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–41 | Analysis Gate (Tier-C extended gate) | 05 | +| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 14 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `monthly-review` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/monthly-review/` already contains all 9 core artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Monthly Review — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
diff --git a/.github/workflows/news-motions.lock.yml b/.github/workflows/news-motions.lock.yml index 049b1b476..085d6d462 100644 --- a/.github/workflows/news-motions.lock.yml +++ b/.github/workflows/news-motions.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"87a02ea5489fdb3c025fcde6dacf95304e98f381a077a17c03bd38a79a74a564","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"76455016144ef8dfcbf0232170e3156824c92c6673f51b64f4db178f4961c7c2","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} 
GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF' + cat << 'GH_AW_PROMPT_b6efdd3a52912758_EOF' - GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF + GH_AW_PROMPT_b6efdd3a52912758_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF' + cat << 'GH_AW_PROMPT_b6efdd3a52912758_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF + GH_AW_PROMPT_b6efdd3a52912758_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF' + cat << 'GH_AW_PROMPT_b6efdd3a52912758_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF + GH_AW_PROMPT_b6efdd3a52912758_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF' + cat << 'GH_AW_PROMPT_b6efdd3a52912758_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -265,7 +263,7 @@ jobs: {{#runtime-import .github/prompts/06-article-generation.md}} {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/workflows/news-motions.md}} - GH_AW_PROMPT_6ab3c59bd1f4ebba_EOF + GH_AW_PROMPT_b6efdd3a52912758_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -289,12 +287,6 @@ jobs: 
GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -313,13 +305,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -414,16 +400,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head -1 | 
awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -498,9 +474,9 
@@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_2544c0f641f092c4_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_2544c0f641f092c4_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_2038ffc57582f341_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_2038ffc57582f341_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -766,7 +742,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_578e1dd627e6339d_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_6991d196a1624e97_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -879,10 +855,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_578e1dd627e6339d_EOF + GH_AW_MCP_CONFIG_6991d196a1624e97_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -900,7 
+877,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1058,15 +1035,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1095,7 +1063,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1220,13 +1187,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1395,79 +1358,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-motions.md b/.github/workflows/news-motions.md index 501ef761c..81fda9a57 100644 --- a/.github/workflows/news-motions.md +++ b/.github/workflows/news-motions.md @@ -40,12 +40,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 +timeout-minutes: 55 
concurrency: group: gh-aw-news-motions-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -97,12 +106,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -228,38 +231,47 @@ Generates deep political intelligence articles on opposition motions in core lan - **Article type**: `motions` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/motions/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR; Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. 
+ +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~43 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–40 | Analysis Gate | 05 | +| 40–43 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 9 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `motions` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/motions/` already contains all 9 required artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Opposition Motions — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Gate, commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
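Article mode starts by reading the existing analysis into context, so it should fail fast if any artifact is absent. A minimal sketch of such a guard, assuming the caller passes the required filenames (the authoritative list of the 9 motions artifacts lives in the analysis methodology module; nothing here is prescribed by this workflow):

```shell
# Illustrative Article-mode guard: refuse to draft articles until every
# required analysis artifact is present and non-empty. Filenames are
# assumptions supplied by the caller, not defined by this sketch.
require_artifacts() {
  dir="$1"; shift
  for f in "$@"; do
    if [ ! -s "$dir/$f" ]; then
      echo "missing or empty: $f" >&2
      return 1
    fi
  done
  echo "ok: $# artifacts present"
}
```

A non-zero return here would send the run back to Analysis mode (or to `report_incomplete`) rather than drafting from a partial artifact set.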
diff --git a/.github/workflows/news-propositions.lock.yml b/.github/workflows/news-propositions.lock.yml index 018699fe6..8cbce194b 100644 --- a/.github/workflows/news-propositions.lock.yml +++ b/.github/workflows/news-propositions.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"2a8b4b58de9bb9d5945610c91ca9ea2075bd51f86c626342cc201679c4bf9007","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"b0a413b865047182c4c0620a32414af7e70fe9f8de064875a6b227a6362a8872","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -203,26 +203,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id 
}} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_956a02d6e412074e_EOF' + cat << 'GH_AW_PROMPT_6795e666bae0ccc7_EOF' - GH_AW_PROMPT_956a02d6e412074e_EOF + GH_AW_PROMPT_6795e666bae0ccc7_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_956a02d6e412074e_EOF' + cat << 'GH_AW_PROMPT_6795e666bae0ccc7_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_956a02d6e412074e_EOF + GH_AW_PROMPT_6795e666bae0ccc7_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_956a02d6e412074e_EOF' + cat << 'GH_AW_PROMPT_6795e666bae0ccc7_EOF' The following GitHub context information is available for this workflow: @@ -252,9 +250,9 @@ jobs: {{/if}} - GH_AW_PROMPT_956a02d6e412074e_EOF + GH_AW_PROMPT_6795e666bae0ccc7_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_956a02d6e412074e_EOF' + cat << 'GH_AW_PROMPT_6795e666bae0ccc7_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -265,7 +263,7 @@ jobs: {{#runtime-import .github/prompts/06-article-generation.md}} {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/workflows/news-propositions.md}} - GH_AW_PROMPT_956a02d6e412074e_EOF + GH_AW_PROMPT_6795e666bae0ccc7_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -289,12 +287,6 
@@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -313,13 +305,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -414,16 +400,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | 
head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ 
-498,9 +474,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_adeff6bf6d40e8b2_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_adeff6bf6d40e8b2_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_2e5cff0a56f2dde7_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_2e5cff0a56f2dde7_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -766,7 +742,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_0fcf314c19f952a3_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_4c94da8583598477_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -879,10 +855,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_0fcf314c19f952a3_EOF + GH_AW_MCP_CONFIG_4c94da8583598477_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -900,7 
+877,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1058,15 +1035,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1095,7 +1063,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1220,13 +1187,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1395,79 +1358,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-propositions.md b/.github/workflows/news-propositions.md index 81b842fd9..214babde0 100644 --- a/.github/workflows/news-propositions.md +++ b/.github/workflows/news-propositions.md @@ -40,12 +40,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 
+timeout-minutes: 55 concurrency: group: gh-aw-news-propositions-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -97,12 +106,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -228,38 +231,47 @@ Generates deep political intelligence articles on Swedish government proposition - **Article type**: `propositions` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/propositions/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR; Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. 
+ +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~43 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–40 | Analysis Gate | 05 | +| 40–43 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 9 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `propositions` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/propositions/` already contains all 9 required artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Government Propositions — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Gate, commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
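The run-mode selection described above can be sketched as a small pre-flight script. This is a hypothetical illustration only: the artifact glob (`*.md` at the top level), the `REQUIRED_COUNT` of 9, and the `FORCE_GENERATION` variable name are assumptions for the sketch, not the actual contract defined in `03-data-download.md`.

```shell
#!/usr/bin/env bash
# Hedged sketch of the Run 1 / Run 2 mode selection. Paths and counts
# are illustrative assumptions, not the authoritative pre-flight check.
set -Eeuo pipefail

ARTICLE_DATE="${ARTICLE_DATE:-$(date -u +%F)}"
ANALYSIS_DIR="analysis/daily/$ARTICLE_DATE/propositions"
REQUIRED_COUNT=9  # assumed number of required top-level artifacts

found=0
if [ -d "$ANALYSIS_DIR" ]; then
  # Count only top-level artifacts; per-document analyses live deeper.
  found=$(find "$ANALYSIS_DIR" -maxdepth 1 -type f -name '*.md' | wc -l)
fi

if [ "${FORCE_GENERATION:-false}" = "true" ] || [ "$found" -lt "$REQUIRED_COUNT" ]; then
  MODE="analysis"   # Run 1: download, analyse, gate, analysis-only PR
else
  MODE="article"    # Run 2: read existing analysis, generate articles
fi

echo "run mode: $MODE ($found/$REQUIRED_COUNT artifacts in $ANALYSIS_DIR)"
```

On a fresh runner with no prior analysis on disk, `found` is 0 and the script selects analysis mode, matching the "No analysis found" branch above.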
diff --git a/.github/workflows/news-realtime-monitor.lock.yml b/.github/workflows/news-realtime-monitor.lock.yml index e742d6625..60c098238 100644 --- a/.github/workflows/news-realtime-monitor.lock.yml +++ b/.github/workflows/news-realtime-monitor.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"f5aa4b831845f568be00dd07bdc273d21cc4ebb6e76fcca95c69c5e1ef422c8c","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"959dbc298bfb5bb52db527eb26640882c7a3a357801ba807432a91b0fcfef47c","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"mcr.microsoft.com/playwright/mcp"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -209,27 +209,25 @@ jobs: GH_AW_GITHUB_REPOSITORY: 
${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_cf77de82423850f2_EOF' + cat << 'GH_AW_PROMPT_34164eb951119646_EOF' - GH_AW_PROMPT_cf77de82423850f2_EOF + GH_AW_PROMPT_34164eb951119646_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/playwright_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_cf77de82423850f2_EOF' + cat << 'GH_AW_PROMPT_34164eb951119646_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_cf77de82423850f2_EOF + GH_AW_PROMPT_34164eb951119646_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_cf77de82423850f2_EOF' + cat << 'GH_AW_PROMPT_34164eb951119646_EOF' The following GitHub context information is available for this workflow: @@ -259,9 +257,9 @@ jobs: {{/if}} - GH_AW_PROMPT_cf77de82423850f2_EOF + GH_AW_PROMPT_34164eb951119646_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_cf77de82423850f2_EOF' + cat << 'GH_AW_PROMPT_34164eb951119646_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -273,7 +271,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-realtime-monitor.md}} - GH_AW_PROMPT_cf77de82423850f2_EOF + GH_AW_PROMPT_34164eb951119646_EOF } > "$GH_AW_PROMPT" - name: Interpolate 
variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -297,12 +295,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -321,13 +313,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -422,16 +408,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if 
nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash 
"${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -506,9 +482,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_0b861792875b19ae_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_0b861792875b19ae_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_de6156ece8d40713_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_de6156ece8d40713_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -776,7 +752,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_5c4d4064e88f8dc5_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_3b0b31b35933b1af_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -903,10 +879,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_5c4d4064e88f8dc5_EOF + GH_AW_MCP_CONFIG_3b0b31b35933b1af_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -924,7 
+901,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1082,15 +1059,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1119,7 +1087,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1244,13 +1211,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1419,79 +1382,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - 
push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-realtime-monitor.md b/.github/workflows/news-realtime-monitor.md index d5ea519b0..f2b526b9d 100644 --- a/.github/workflows/news-realtime-monitor.md +++ b/.github/workflows/news-realtime-monitor.md @@ -51,12 +51,21 @@ permissions: discussions: read security-events: read 
-timeout-minutes: 60 +timeout-minutes: 55 concurrency: group: gh-aw-news-realtime-monitor-${{ inputs.article_date || 'today' }} cancel-in-progress: false +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -109,12 +118,6 @@ tools: agentic-workflows: true bash: true playwright: - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: allowed-domains: @@ -244,38 +247,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a - **Article type**: `breaking` - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/realtime-$HHMM/` - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`) -- **One pull request per run** containing analysis + articles + visualisation data. +- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR. Note: realtime runs use a time-stamped subfolder (`realtime-HHMM`) so morning and afternoon runs each have independent analysis folders. + +## Time budget + +**Run 1 — Analysis mode** (no prior analysis found, ~45 min): + +| Minutes | Phase | Module | +|---------|-------|--------| +| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 | +| 2–7 | Download data + catalogue | 03 | +| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 
5 Tier-C) | 04 / ext | +| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 | +| 38–41 | Analysis Gate (Tier-C extended gate) | 05 | +| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 | -## Time budget (60 min, minimum 45 min of real work) +**Run 2 — Article mode** (analysis exists on disk, ~25 min): | Minutes | Phase | Module | |---------|-------|--------| -| 0–2 | MCP pre-warm + `get_sync_status` | 02 | -| 2–6 | Download data + catalogue | 03 | -| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 | -| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 | -| 35–37 | Analysis Gate | 05 | -| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 | -| 48–55 | Visual + link validation | 06 | -| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 | +| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 | +| 2–5 | Read all 14 analysis artifacts into context | 06 | +| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 | +| 18–22 | Visual + link validation | 06 | +| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 | -Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR. +Trim scope before quality. Never open a second PR within a run — there is no second PR. 
## Inputs - `article_date` — override date (defaults to today) -- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless) +- `force_generation` — regenerate even if today's article exists; also forces analysis re-run - `languages` — core content languages (default `en,sv`) - `analysis_depth` — `standard` | `deep` (default) | `comprehensive` -## Dedup & analysis-only path +## Run-mode selection -If articles for `$ARTICLE_DATE` + `breaking` already exist **and** `force_generation=false`: +At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/realtime-$HHMM/` already contains all 9 core artifacts: -- Still run the full analysis pipeline (modules 03 → 04 → 05). -- Commit the analysis. -- Open the single PR with title `📊 Analysis Only — Realtime Monitor — $ARTICLE_DATE` and label `analysis-only`. +- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop. +- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`. -Analysis is the primary product — a run never "does nothing" just because articles exist. +Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never produces nothing. All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules. 
diff --git a/.github/workflows/news-translate.lock.yml b/.github/workflows/news-translate.lock.yml index ad345909e..0c4bf6354 100644 --- a/.github/workflows/news-translate.lock.yml +++ b/.github/workflows/news-translate.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"e0b4de7e3b8000d4d0183e5d5dfc98bc449e515864d6479d5ac7d57c643c239f","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"0c889730a1495c5e2f6ac2176f8b0bd1509dbffc0a2670c4dadb02e67b753534","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -202,26 +202,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} 
GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_a21f108b44ae4e0e_EOF' + cat << 'GH_AW_PROMPT_fa5aab05db3a7066_EOF' - GH_AW_PROMPT_a21f108b44ae4e0e_EOF + GH_AW_PROMPT_fa5aab05db3a7066_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_a21f108b44ae4e0e_EOF' + cat << 'GH_AW_PROMPT_fa5aab05db3a7066_EOF' Tools: add_comment, create_pull_request, missing_tool, missing_data, noop - GH_AW_PROMPT_a21f108b44ae4e0e_EOF + GH_AW_PROMPT_fa5aab05db3a7066_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_a21f108b44ae4e0e_EOF' + cat << 'GH_AW_PROMPT_fa5aab05db3a7066_EOF' The following GitHub context information is available for this workflow: @@ -251,16 +249,16 @@ jobs: {{/if}} - GH_AW_PROMPT_a21f108b44ae4e0e_EOF + GH_AW_PROMPT_fa5aab05db3a7066_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_a21f108b44ae4e0e_EOF' + cat << 'GH_AW_PROMPT_fa5aab05db3a7066_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} {{#runtime-import .github/prompts/02-mcp-access.md}} {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/workflows/news-translate.md}} - GH_AW_PROMPT_a21f108b44ae4e0e_EOF + GH_AW_PROMPT_fa5aab05db3a7066_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -284,12 +282,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} 
GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -308,13 +300,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -432,16 +418,6 @@ jobs: fi echo "✅ Found $EN_SOURCE_COUNT EN source article(s) for $ARTICLE_DATE — proceeding with translation" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: 
REPO_NAME: ${{ github.repository }} @@ -516,9 +492,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_3a0487753dbffc3f_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","translation"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_3a0487753dbffc3f_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_8a5616f281ed5b2a_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","translation"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_8a5616f281ed5b2a_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -749,7 +725,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_6a45268f9eb9a59c_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_87e2fd7edf103486_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -862,10 +838,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_6a45268f9eb9a59c_EOF + GH_AW_MCP_CONFIG_87e2fd7edf103486_EOF - name: Download activation artifact uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -883,7 +860,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): 
- timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1041,15 +1018,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1078,7 +1046,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1202,13 +1169,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "false" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1377,79 +1340,6 @@ jobs: const { main } = require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && 
(needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-translate.md b/.github/workflows/news-translate.md index ad5984293..4ebbb8e95 100644 --- a/.github/workflows/news-translate.md +++ b/.github/workflows/news-translate.md @@ -43,13 +43,22 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 +timeout-minutes: 
55 concurrency: group: gh-aw-news-translate-${{ inputs.article_type || 'batch' }}-${{ inputs.article_date || 'today' }} job-discriminator: ${{ inputs.article_type || 'batch' }}-${{ inputs.article_date || 'today' }} cancel-in-progress: true +features: + mcp-gateway: true + +sandbox: + agent: awf + mcp: + port: 8080 + keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start + runtimes: node: version: "25" @@ -101,12 +110,6 @@ tools: - all agentic-workflows: true bash: true - repo-memory: - branch-name: memory/news-generation - allowed-extensions: [".md", ".json"] - max-file-size: 51200 - max-file-count: 50 - max-patch-size: 51200 safe-outputs: report-failure-as-issue: false @@ -323,14 +326,14 @@ Translation is a pure-derivative workflow: - Keep the PR under the safe-outputs 100-file cap. If more translations are pending than fit in one PR, translate the highest-priority batch and leave the rest for the next scheduled run. - Skip any language whose translation already exists and is non-empty unless `force` is explicitly requested. -## Time budget (60 min) +## Time budget (~40 min) | Minutes | Phase | |---------|-------| | 0–2 | MCP pre-warm + date resolution | -| 2–8 | Scan untranslated articles; build work list | -| 8–52 | Translate + validate in priority order (highest-value types first) | -| 52–58 | Final validation, stage, commit | -| 58–60 | **One** `safeoutputs___create_pull_request` call | +| 2–6 | Scan untranslated articles; build work list | +| 6–36 | Translate + validate in priority order (highest-value types first) | +| 36–39 | Final validation, stage, commit | +| 39–41 | **One** `safeoutputs___create_pull_request` call | All non-workflow-specific rules are in the imported modules — do not restate them here. 
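The scan phase of the budget above (building the untranslated work list, then skipping non-empty existing translations) can be sketched as follows. The `-en.html`/`-$lang.html` naming and the `TARGET_LANGS` default are assumptions for illustration; `news-translate.md` defines the real file scheme and language set.

```bash
# Illustrative work-list scan for the 2–6 min window in the time budget.
# The filename suffix convention and TARGET_LANGS are assumptions for this sketch.
set -euo pipefail

ARTICLE_DATE="${ARTICLE_DATE:-$(date -u +%F)}"
TARGET_LANGS="${TARGET_LANGS:-de fr es}"
FORCE="${FORCE:-false}"

worklist=()
for src in news/"$ARTICLE_DATE"-*-en.html; do
  [ -e "$src" ] || continue                 # glob unmatched: no EN sources yet
  for lang in $TARGET_LANGS; do
    dst="${src%-en.html}-$lang.html"
    # Skip translations that already exist and are non-empty, unless forced.
    if [ "$FORCE" != "true" ] && [ -s "$dst" ]; then
      continue
    fi
    worklist+=("$dst")
  done
done
echo "pending translations: ${#worklist[@]}"
```

Translating in priority order then stops at the 100-file PR cap, leaving the remainder of `worklist` for the next scheduled run.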
diff --git a/.github/workflows/news-week-ahead.lock.yml b/.github/workflows/news-week-ahead.lock.yml index cb5cd79ce..1f33809e0 100644 --- a/.github/workflows/news-week-ahead.lock.yml +++ b/.github/workflows/news-week-ahead.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"537e56999f77854e9b504eeaa1a95b7b5ed7b46fcceeb5925db0009eb9dbb689","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"94051ac701b4cdb38b285dd56b6068734b9bffde701d2957e309f9a81ab770ce","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -204,26 +204,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} 
GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_e024284526b1041e_EOF' + cat << 'GH_AW_PROMPT_95df5d12dce6a208_EOF' - GH_AW_PROMPT_e024284526b1041e_EOF + GH_AW_PROMPT_95df5d12dce6a208_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_e024284526b1041e_EOF' + cat << 'GH_AW_PROMPT_95df5d12dce6a208_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_e024284526b1041e_EOF + GH_AW_PROMPT_95df5d12dce6a208_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_e024284526b1041e_EOF' + cat << 'GH_AW_PROMPT_95df5d12dce6a208_EOF' The following GitHub context information is available for this workflow: @@ -253,9 +251,9 @@ jobs: {{/if}} - GH_AW_PROMPT_e024284526b1041e_EOF + GH_AW_PROMPT_95df5d12dce6a208_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_e024284526b1041e_EOF' + cat << 'GH_AW_PROMPT_95df5d12dce6a208_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -267,7 +265,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-week-ahead.md}} - GH_AW_PROMPT_e024284526b1041e_EOF + GH_AW_PROMPT_95df5d12dce6a208_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ -291,12 +289,6 @@ 
jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -315,13 +307,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -416,16 +402,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep \"Address:\" | head 
-1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ github.repository }} @@ -500,9 
+476,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_cefd10ddfb5e1cfc_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"aw_context_workflows":["news-translate"],"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_cefd10ddfb5e1cfc_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_d30fc87603f14a70_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"aw_context_workflows":["news-translate"],"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_d30fc87603f14a70_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -773,7 +749,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_3bb6e7417a2d49cd_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_3dfa7881a7763865_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -886,10 +862,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_3bb6e7417a2d49cd_EOF + GH_AW_MCP_CONFIG_3dfa7881a7763865_EOF - name: Download activation artifact uses: 
actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -907,7 +884,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1065,15 +1042,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1102,7 +1070,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1227,13 +1194,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1402,79 +1365,6 @@ jobs: const { main } = 
require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-week-ahead.md b/.github/workflows/news-week-ahead.md index 7e369a695..44a92051e 100644 --- a/.github/workflows/news-week-ahead.md +++ b/.github/workflows/news-week-ahead.md @@ -41,12 +41,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 
+timeout-minutes: 55
 
 concurrency:
   group: gh-aw-news-week-ahead-${{ inputs.article_date || 'today' }}
   cancel-in-progress: false
 
+features:
+  mcp-gateway: true
+
+sandbox:
+  agent: awf
+  mcp:
+    port: 8080
+    keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start
+
 runtimes:
   node:
     version: "25"
@@ -98,12 +107,6 @@ tools:
       - all
   agentic-workflows: true
   bash: true
-  repo-memory:
-    branch-name: memory/news-generation
-    allowed-extensions: [".md", ".json"]
-    max-file-size: 51200
-    max-file-count: 50
-    max-patch-size: 51200
 
 safe-outputs:
   allowed-domains:
@@ -233,38 +236,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a
 - **Article type**: `week-ahead`
 - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/week-ahead/`
 - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`)
-- **One pull request per run** containing analysis + articles + visualisation data.
+- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR.
+
+## Time budget
+
+**Run 1 — Analysis mode** (no prior analysis found, ~45 min):
+
+| Minutes | Phase | Module |
+|---------|-------|--------|
+| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 |
+| 2–7 | Download data + catalogue | 03 |
+| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 5 Tier-C) | 04 / ext |
+| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 |
+| 38–41 | Analysis Gate (Tier-C extended gate) | 05 |
+| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 |
 
-## Time budget (60 min, minimum 45 min of real work)
+**Run 2 — Article mode** (analysis exists on disk, ~25 min):
 
 | Minutes | Phase | Module |
 |---------|-------|--------|
-| 0–2 | MCP pre-warm + `get_sync_status` | 02 |
-| 2–6 | Download data + catalogue | 03 |
-| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 |
-| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 |
-| 35–37 | Analysis Gate | 05 |
-| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 |
-| 48–55 | Visual + link validation | 06 |
-| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 |
+| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 |
+| 2–5 | Read all 14 analysis artifacts into context | 06 |
+| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 |
+| 18–22 | Visual + link validation | 06 |
+| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 |
 
-Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR.
+Trim scope before quality. Never open a second PR within a run — there is no second PR.
 
 ## Inputs
 
 - `article_date` — override date (defaults to today)
-- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless)
+- `force_generation` — regenerate even if today's article exists; also forces analysis re-run
 - `languages` — core content languages (default `en,sv`)
 - `analysis_depth` — `standard` | `deep` (default) | `comprehensive`
 
-## Dedup & analysis-only path
+## Run-mode selection
 
-If articles for `$ARTICLE_DATE` + `week-ahead` already exist **and** `force_generation=false`:
+At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/week-ahead/` already contains all 9 core artifacts:
 
-- Still run the full analysis pipeline (modules 03 → 04 → 05).
-- Commit the analysis.
-- Open the single PR with title `📊 Analysis Only — Week Ahead — $ARTICLE_DATE` and label `analysis-only`.
+- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop.
+- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`.
 
-Analysis is the primary product — a run never "does nothing" just because articles exist.
+Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never ends empty-handed.
 
 All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules.
diff --git a/.github/workflows/news-weekly-review.lock.yml b/.github/workflows/news-weekly-review.lock.yml index eea103996..c39f957c2 100644 --- a/.github/workflows/news-weekly-review.lock.yml +++ b/.github/workflows/news-weekly-review.lock.yml @@ -1,4 +1,4 @@ -# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"78ea852ee56c1208c9a62365f934f9f5bd84002030f83d834ccea70d568ded88","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} +# gh-aw-metadata: {"schema_version":"v3","frontmatter_hash":"b0d080816e321919f8e4f40acb27c56d2bcf89d43b356e39045ecf2737a35f68","compiler_version":"v0.69.3","agent_id":"copilot","agent_model":"claude-opus-4.7"} # gh-aw-manifest: {"version":1,"secrets":["COPILOT_GITHUB_TOKEN","GH_AW_CI_TRIGGER_TOKEN","GH_AW_GITHUB_MCP_SERVER_TOKEN","GH_AW_GITHUB_TOKEN","GITHUB_TOKEN"],"actions":[{"repo":"actions/checkout","sha":"de0fac2e4500dabe0009e67214ff5f5447ce83dd","version":"v6.0.2"},{"repo":"actions/download-artifact","sha":"3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c","version":"v8.0.1"},{"repo":"actions/github-script","sha":"373c709c69115d41ff229c7e5df9f8788daa9553","version":"v9"},{"repo":"actions/setup-node","sha":"6044e13b5dc448c55e2357c09f80417699197238","version":"6044e13b5dc448c55e2357c09f80417699197238"},{"repo":"actions/upload-artifact","sha":"043fb46d1a93c77aae656e7c1c64a875d1fc6a0a","version":"v7.0.1"},{"repo":"github/gh-aw-actions/setup","sha":"006ffd856b868b71df342dbe0ba082a963249b31","version":"v0.69.3"}],"containers":[{"image":"alpine:latest"},{"image":"ghcr.io/github/gh-aw-firewall/agent:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/api-proxy:0.25.26"},{"image":"ghcr.io/github/gh-aw-firewall/squid:0.25.26"},{"image":"ghcr.io/github/gh-aw-mcpg:v0.2.26"},{"image":"ghcr.io/github/github-mcp-server:v1.0.0"},{"image":"node:25-alpine"},{"image":"node:lts-alpine"}]} # ___ _ _ # / _ \ | | (_) @@ -204,26 +204,24 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ 
github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_WIKI_NOTE: ${{ '' }} # poutine:ignore untrusted_checkout_exec run: | bash "${RUNNER_TEMP}/gh-aw/actions/create_prompt_first.sh" { - cat << 'GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF' + cat << 'GH_AW_PROMPT_0d77e55e01232230_EOF' - GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF + GH_AW_PROMPT_0d77e55e01232230_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/xpia.md" cat "${RUNNER_TEMP}/gh-aw/prompts/temp_folder_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/markdown.md" cat "${RUNNER_TEMP}/gh-aw/prompts/agentic_workflows_guide.md" - cat "${RUNNER_TEMP}/gh-aw/prompts/repo_memory_prompt.md" cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_prompt.md" - cat << 'GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF' + cat << 'GH_AW_PROMPT_0d77e55e01232230_EOF' Tools: add_comment, create_pull_request, dispatch_workflow, missing_tool, missing_data, noop - GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF + GH_AW_PROMPT_0d77e55e01232230_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/safe_outputs_create_pull_request.md" - cat << 'GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF' + cat << 'GH_AW_PROMPT_0d77e55e01232230_EOF' The following GitHub context information is available for this workflow: @@ -253,9 +251,9 @@ jobs: {{/if}} - GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF + GH_AW_PROMPT_0d77e55e01232230_EOF cat "${RUNNER_TEMP}/gh-aw/prompts/github_mcp_tools_with_safeoutputs_prompt.md" - cat << 'GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF' + cat << 'GH_AW_PROMPT_0d77e55e01232230_EOF' {{#runtime-import .github/prompts/00-base-contract.md}} {{#runtime-import .github/prompts/01-bash-and-shell-safety.md}} @@ -267,7 +265,7 @@ jobs: {{#runtime-import .github/prompts/07-commit-and-pr.md}} {{#runtime-import .github/prompts/ext/tier-c-aggregation.md}} {{#runtime-import .github/workflows/news-weekly-review.md}} - GH_AW_PROMPT_e4ccc1c76dd1cbd7_EOF + GH_AW_PROMPT_0d77e55e01232230_EOF } > "$GH_AW_PROMPT" - name: Interpolate variables and render templates uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 @@ 
-291,12 +289,6 @@ jobs: GH_AW_GITHUB_REPOSITORY: ${{ github.repository }} GH_AW_GITHUB_RUN_ID: ${{ github.run_id }} GH_AW_GITHUB_WORKSPACE: ${{ github.workspace }} - GH_AW_MEMORY_BRANCH_NAME: 'memory/news-generation' - GH_AW_MEMORY_CONSTRAINTS: "\n\n**Constraints:**\n- **Max File Size**: 51200 bytes (0.05 MB) per file\n- **Max File Count**: 50 files per commit\n- **Max Patch Size**: 51200 bytes (50 KB) total per push (max: 100 KB)\n" - GH_AW_MEMORY_DESCRIPTION: '' - GH_AW_MEMORY_DIR: '/tmp/gh-aw/repo-memory/default/' - GH_AW_MEMORY_TARGET_REPO: ' of the current repository' - GH_AW_WIKI_NOTE: '' with: script: | const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); @@ -315,13 +307,7 @@ jobs: GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER: process.env.GH_AW_GITHUB_EVENT_PULL_REQUEST_NUMBER, GH_AW_GITHUB_REPOSITORY: process.env.GH_AW_GITHUB_REPOSITORY, GH_AW_GITHUB_RUN_ID: process.env.GH_AW_GITHUB_RUN_ID, - GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE, - GH_AW_MEMORY_BRANCH_NAME: process.env.GH_AW_MEMORY_BRANCH_NAME, - GH_AW_MEMORY_CONSTRAINTS: process.env.GH_AW_MEMORY_CONSTRAINTS, - GH_AW_MEMORY_DESCRIPTION: process.env.GH_AW_MEMORY_DESCRIPTION, - GH_AW_MEMORY_DIR: process.env.GH_AW_MEMORY_DIR, - GH_AW_MEMORY_TARGET_REPO: process.env.GH_AW_MEMORY_TARGET_REPO, - GH_AW_WIKI_NOTE: process.env.GH_AW_WIKI_NOTE + GH_AW_GITHUB_WORKSPACE: process.env.GH_AW_GITHUB_WORKSPACE } }); - name: Validate prompt placeholders @@ -416,16 +402,6 @@ jobs: - name: Pre-flight external endpoint reachability check (runs before MCP Gateway) run: "echo \"🔍 Network Diagnostics — $(date -u '+%Y-%m-%dT%H:%M:%SZ')\"\necho \"═══════════════════════════════════════════\"\necho \"\"\necho \"📡 DNS Resolution Tests:\"\nfor domain in riksdag-regering-ai.onrender.com api.scb.se api.worldbank.org data.riksdagen.se www.riksdagen.se www.regeringen.se; do\n if nslookup \"$domain\" >/dev/null 2>&1; then\n IP=$(nslookup \"$domain\" 2>/dev/null | grep -A1 \"Name:\" | grep 
\"Address:\" | head -1 | awk '{print $2}')\n echo \" ✅ $domain → $IP\"\n else\n echo \" ❌ $domain — DNS FAILED\"\n fi\ndone\necho \"\"\necho \"🌐 HTTPS Connectivity Tests:\"\nfor url in \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" \\\n \"https://api.scb.se/OV0104/v2beta\" \\\n \"https://api.worldbank.org/v2/country/SE?format=json\" \\\n \"https://data.riksdagen.se/dokumentlista/?sok=test&doktyp=bet&utformat=json&a=1\" \\\n; do\n HTTP_CODE=$(curl -s -o /dev/null -w \"%{http_code}\" --max-time 10 \"$url\" 2>/dev/null || echo \"000\")\n DOMAIN=$(echo \"$url\" | sed 's|https://||' | cut -d/ -f1)\n if [ \"$HTTP_CODE\" -ge 200 ] && [ \"$HTTP_CODE\" -lt 400 ]; then\n echo \" ✅ $DOMAIN → HTTP $HTTP_CODE\"\n elif [ \"$HTTP_CODE\" = \"000\" ]; then\n echo \" ❌ $DOMAIN → TIMEOUT/UNREACHABLE\"\n else\n echo \" ⚠️ $DOMAIN → HTTP $HTTP_CODE\"\n fi\ndone\necho \"\"\necho \"🔌 MCP Server Tool Count:\"\nTOOL_RESP=$(curl -sf --max-time 15 -X POST \\\n -H \"Content-Type: application/json\" \\\n -d '{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"tools/list\",\"params\":{}}' \\\n \"https://riksdag-regering-ai.onrender.com/mcp\" 2>/dev/null) || TOOL_RESP=\"\"\nif echo \"$TOOL_RESP\" | grep -q '\"tools\"'; then\n TOOL_COUNT=$(echo \"$TOOL_RESP\" | grep -o '\"name\"' | wc -l)\n echo \" ✅ riksdag-regering MCP: $TOOL_COUNT tools registered\"\nelse\n echo \" ❌ riksdag-regering MCP: No tools response (server may still be starting)\"\nfi\necho \"\"\necho \"═══════════════════════════════════════════\"\n" - # Repo memory git-based storage configuration from frontmatter processed below - - name: Clone repo-memory branch (default) - env: - GH_TOKEN: ${{ github.token }} - GITHUB_SERVER_URL: ${{ github.server_url }} - BRANCH_NAME: memory/news-generation - TARGET_REPO: ${{ github.repository }} - MEMORY_DIR: /tmp/gh-aw/repo-memory/default - CREATE_ORPHAN: true - run: bash "${RUNNER_TEMP}/gh-aw/actions/clone_repo_memory_branch.sh" - name: Configure Git credentials env: REPO_NAME: ${{ 
github.repository }} @@ -500,9 +476,9 @@ jobs: mkdir -p "${RUNNER_TEMP}/gh-aw/safeoutputs" mkdir -p /tmp/gh-aw/safeoutputs mkdir -p /tmp/gh-aw/mcp-logs/safeoutputs - cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_dba07803c934c7ab_EOF' - {"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"aw_context_workflows":["news-translate"],"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"push_repo_memory":{"memories":[{"dir":"/tmp/gh-aw/repo-memory/default","id":"default","max_file_count":50,"max_file_size":51200,"max_patch_size":51200}]},"report_incomplete":{}} - GH_AW_SAFE_OUTPUTS_CONFIG_dba07803c934c7ab_EOF + cat > "${RUNNER_TEMP}/gh-aw/safeoutputs/config.json" << 'GH_AW_SAFE_OUTPUTS_CONFIG_d38bccc0af0d1d20_EOF' + 
{"add_comment":{"max":1},"create_pull_request":{"draft":false,"expires":336,"labels":["agentic-news","analysis-data"],"max":1,"max_patch_size":4096,"protected_files":["package.json","bun.lockb","bunfig.toml","deno.json","deno.jsonc","deno.lock","global.json","NuGet.Config","Directory.Packages.props","mix.exs","mix.lock","go.mod","go.sum","stack.yaml","stack.yaml.lock","pom.xml","build.gradle","build.gradle.kts","settings.gradle","settings.gradle.kts","gradle.properties","package-lock.json","yarn.lock","pnpm-lock.yaml","npm-shrinkwrap.json","requirements.txt","Pipfile","Pipfile.lock","pyproject.toml","setup.py","setup.cfg","Gemfile","Gemfile.lock","uv.lock","CODEOWNERS","AGENTS.md","CLAUDE.md","GEMINI.md"],"protected_path_prefixes":[".github/",".agents/"]},"create_report_incomplete_issue":{},"dispatch_workflow":{"aw_context_workflows":["news-translate"],"max":1,"workflow_files":{"news-translate":".lock.yml"},"workflows":["news-translate"]},"missing_data":{},"missing_tool":{},"noop":{"max":1,"report-as-issue":"true"},"report_incomplete":{}} + GH_AW_SAFE_OUTPUTS_CONFIG_d38bccc0af0d1d20_EOF - name: Write Safe Outputs Tools env: GH_AW_TOOLS_META_JSON: | @@ -773,7 +749,7 @@ jobs: mkdir -p /home/runner/.copilot GH_AW_NODE=$(which node 2>/dev/null || command -v node 2>/dev/null || echo node) - cat << GH_AW_MCP_CONFIG_a25c83c6cb6ff8b2_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" + cat << GH_AW_MCP_CONFIG_57a6e9819fe99cd4_EOF | "$GH_AW_NODE" "${RUNNER_TEMP}/gh-aw/actions/start_mcp_gateway.cjs" { "mcpServers": { "agenticworkflows": { @@ -886,10 +862,11 @@ jobs: "port": $MCP_GATEWAY_PORT, "domain": "${MCP_GATEWAY_DOMAIN}", "apiKey": "${MCP_GATEWAY_API_KEY}", - "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}" + "payloadDir": "${MCP_GATEWAY_PAYLOAD_DIR}", + "keepaliveInterval": 300 } } - GH_AW_MCP_CONFIG_a25c83c6cb6ff8b2_EOF + GH_AW_MCP_CONFIG_57a6e9819fe99cd4_EOF - name: Download activation artifact uses: 
actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 with: @@ -907,7 +884,7 @@ jobs: - name: Execute GitHub Copilot CLI id: agentic_execution # Copilot CLI tool arguments (sorted): - timeout-minutes: 60 + timeout-minutes: 55 run: | set -o pipefail touch /tmp/gh-aw/agent-step-summary.md @@ -1065,15 +1042,6 @@ jobs: if [ ! -f /tmp/gh-aw/agent_output.json ]; then echo '{"items":[]}' > /tmp/gh-aw/agent_output.json fi - # Upload repo memory as artifacts for push job - - name: Upload repo-memory artifact (default) - if: always() - uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1 - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - retention-days: 1 - if-no-files-found: ignore - name: Upload agent artifacts if: always() continue-on-error: true @@ -1102,7 +1070,6 @@ jobs: - activation - agent - detection - - push_repo_memory - safe_outputs if: > always() && (needs.agent.result != 'skipped' || needs.activation.outputs.lockdown_check_failed == 'true' || @@ -1227,13 +1194,9 @@ jobs: GH_AW_CODE_PUSH_FAILURE_COUNT: ${{ needs.safe_outputs.outputs.code_push_failure_count }} GH_AW_LOCKDOWN_CHECK_FAILED: ${{ needs.activation.outputs.lockdown_check_failed }} GH_AW_STALE_LOCK_FILE_FAILED: ${{ needs.activation.outputs.stale_lock_file_failed }} - GH_AW_PUSH_REPO_MEMORY_RESULT: ${{ needs.push_repo_memory.result }} - GH_AW_REPO_MEMORY_VALIDATION_FAILED_default: ${{ needs.push_repo_memory.outputs.validation_failed_default }} - GH_AW_REPO_MEMORY_VALIDATION_ERROR_default: ${{ needs.push_repo_memory.outputs.validation_error_default }} - GH_AW_REPO_MEMORY_PATCH_SIZE_EXCEEDED_default: ${{ needs.push_repo_memory.outputs.patch_size_exceeded_default }} GH_AW_GROUP_REPORTS: "false" GH_AW_FAILURE_REPORT_AS_ISSUE: "true" - GH_AW_TIMEOUT_MINUTES: "60" + GH_AW_TIMEOUT_MINUTES: "55" with: github-token: ${{ secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }} script: | @@ -1402,79 +1365,6 @@ jobs: const { main } = 
require('${{ runner.temp }}/gh-aw/actions/parse_threat_detection_results.cjs'); await main(); - push_repo_memory: - needs: - - activation - - agent - - detection - if: > - always() && (!cancelled()) && (needs.detection.result == 'success' || needs.detection.result == 'skipped') && - needs.agent.result != 'skipped' - runs-on: ubuntu-slim - permissions: - contents: write - concurrency: - group: "push-repo-memory-${{ github.repository }}|memory/news-generation" - cancel-in-progress: false - outputs: - patch_size_exceeded_default: ${{ steps.push_repo_memory_default.outputs.patch_size_exceeded }} - validation_error_default: ${{ steps.push_repo_memory_default.outputs.validation_error }} - validation_failed_default: ${{ steps.push_repo_memory_default.outputs.validation_failed }} - steps: - - name: Setup Scripts - id: setup - uses: github/gh-aw-actions/setup@006ffd856b868b71df342dbe0ba082a963249b31 # v0.69.3 - with: - destination: ${{ runner.temp }}/gh-aw/actions - job-name: ${{ github.job }} - trace-id: ${{ needs.activation.outputs.setup-trace-id }} - - name: Checkout repository - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 - with: - persist-credentials: false - sparse-checkout: . 
- - name: Configure Git credentials - env: - REPO_NAME: ${{ github.repository }} - SERVER_URL: ${{ github.server_url }} - GITHUB_TOKEN: ${{ github.token }} - run: | - git config --global user.email "github-actions[bot]@users.noreply.github.com" - git config --global user.name "github-actions[bot]" - git config --global am.keepcr true - # Re-authenticate git with GitHub token - SERVER_URL_STRIPPED="${SERVER_URL#https://}" - git remote set-url origin "https://x-access-token:${GITHUB_TOKEN}@${SERVER_URL_STRIPPED}/${REPO_NAME}.git" - echo "Git configured with standard GitHub Actions identity" - - name: Download repo-memory artifact (default) - uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1 - continue-on-error: true - with: - name: repo-memory-default - path: /tmp/gh-aw/repo-memory/default - - name: Push repo-memory changes (default) - id: push_repo_memory_default - if: always() - uses: actions/github-script@373c709c69115d41ff229c7e5df9f8788daa9553 # v9 - env: - GH_TOKEN: ${{ github.token }} - GITHUB_RUN_ID: ${{ github.run_id }} - GITHUB_SERVER_URL: ${{ github.server_url }} - ARTIFACT_DIR: /tmp/gh-aw/repo-memory/default - MEMORY_ID: default - TARGET_REPO: ${{ github.repository }} - BRANCH_NAME: memory/news-generation - MAX_FILE_SIZE: 51200 - MAX_FILE_COUNT: 50 - MAX_PATCH_SIZE: 51200 - ALLOWED_EXTENSIONS: '[".md",".json"]' - with: - script: | - const { setupGlobals } = require('${{ runner.temp }}/gh-aw/actions/setup_globals.cjs'); - setupGlobals(core, github, context, exec, io, getOctokit); - const { main } = require('${{ runner.temp }}/gh-aw/actions/push_repo_memory.cjs'); - await main(); - safe_outputs: needs: - activation diff --git a/.github/workflows/news-weekly-review.md b/.github/workflows/news-weekly-review.md index 41e734f8c..67baeb5a0 100644 --- a/.github/workflows/news-weekly-review.md +++ b/.github/workflows/news-weekly-review.md @@ -41,12 +41,21 @@ permissions: discussions: read security-events: read -timeout-minutes: 60 
+timeout-minutes: 55
 
 concurrency:
   group: gh-aw-news-weekly-review-${{ inputs.article_date || 'today' }}
   cancel-in-progress: false
 
+features:
+  mcp-gateway: true
+
+sandbox:
+  agent: awf
+  mcp:
+    port: 8080
+    keepalive-interval: 300 # 5m ping to keep MCP connections alive; Copilot API token expires ~60min so PR must be created within 25min of agent start
+
 runtimes:
   node:
     version: "25"
@@ -98,12 +107,6 @@ tools:
       - all
   agentic-workflows: true
   bash: true
-  repo-memory:
-    branch-name: memory/news-generation
-    allowed-extensions: [".md", ".json"]
-    max-file-size: 51200
-    max-file-count: 50
-    max-patch-size: 51200
 
 safe-outputs:
   allowed-domains:
@@ -233,38 +236,47 @@ This workflow imports `../prompts/ext/tier-c-aggregation.md`. Produce **all 14 a
 - **Article type**: `weekly-review`
 - **Analysis subfolder**: `analysis/daily/$ARTICLE_DATE/weekly-review/`
 - **Core languages produced**: `en`, `sv` (remaining 12 languages dispatched to `news-translate`)
-- **One pull request per run** containing analysis + articles + visualisation data.
+- **Two-run model**: Run 1 produces an `analysis-only` PR (14 artifacts); Run 2 (next scheduled run, same day) detects existing analysis and produces an articles PR.
+
+## Time budget
+
+**Run 1 — Analysis mode** (no prior analysis found, ~45 min):
+
+| Minutes | Phase | Module |
+|---------|-------|--------|
+| 0–2 | MCP pre-warm + pre-flight analysis check | 02 / 03 |
+| 2–7 | Download data + catalogue | 03 |
+| 7–27 | Analysis Pass 1 (methodology read + per-doc analyses + 14 artifacts incl. 5 Tier-C) | 04 / ext |
+| 27–38 | Analysis Pass 2 (read-back + improvements) | 04 |
+| 38–41 | Analysis Gate (Tier-C extended gate) | 05 |
+| 41–45 | Stage analysis, commit, **ONE** `safeoutputs___create_pull_request` (analysis-only) | 07 |
 
-## Time budget (60 min, minimum 45 min of real work)
+**Run 2 — Article mode** (analysis exists on disk, ~25 min):
 
 | Minutes | Phase | Module |
 |---------|-------|--------|
-| 0–2 | MCP pre-warm + `get_sync_status` | 02 |
-| 2–6 | Download data + catalogue | 03 |
-| 6–25 | Analysis Pass 1 (methodology read + per-doc analyses + 9 artifacts) | 04 |
-| 25–35 | Analysis Pass 2 (read-back + improvements) | 04 |
-| 35–37 | Analysis Gate | 05 |
-| 37–48 | Article Pass 1 + Pass 2 (EN, SV) | 06 |
-| 48–55 | Visual + link validation | 06 |
-| 55–60 | Stage, commit, **ONE** `safeoutputs___create_pull_request` | 07 |
+| 0–2 | MCP pre-warm + pre-flight check (SKIP_ANALYSIS=true) | 02 / 03 |
+| 2–5 | Read all 14 analysis artifacts into context | 06 |
+| 5–18 | Article Pass 1 + Pass 2 (EN, SV) | 06 |
+| 18–22 | Visual + link validation | 06 |
+| 22–25 | Stage articles, commit, **ONE** `safeoutputs___create_pull_request` | 07 |
 
-Trim scope before quality. Never open a second PR to "save" partial work — there is no second PR.
+Trim scope before quality. Never open a second PR within a run — there is no second PR.
 
 ## Inputs
 
 - `article_date` — override date (defaults to today)
-- `force_generation` — regenerate even if today's article exists (analysis is always refreshed regardless)
+- `force_generation` — regenerate even if today's article exists; also forces analysis re-run
 - `languages` — core content languages (default `en,sv`)
 - `analysis_depth` — `standard` | `deep` (default) | `comprehensive`
 
-## Dedup & analysis-only path
+## Run-mode selection
 
-If articles for `$ARTICLE_DATE` + `weekly-review` already exist **and** `force_generation=false`:
+At the start of every run, the pre-flight check in `03-data-download.md` detects whether `analysis/daily/$ARTICLE_DATE/weekly-review/` already contains all 9 core artifacts:
 
-- Still run the full analysis pipeline (modules 03 → 04 → 05).
-- Commit the analysis.
-- Open the single PR with title `📊 Analysis Only — Weekly Review — $ARTICLE_DATE` and label `analysis-only`.
+- **No analysis found** → Analysis mode: download data, run Pass 1 + Pass 2 + Tier-C Gate (14 artifacts), commit analysis artifacts, open `analysis-only` PR, stop.
+- **Analysis found** → Article mode: read existing analysis, generate articles, commit articles, open articles PR + dispatch `news-translate`.
 
-Analysis is the primary product — a run never "does nothing" just because articles exist.
+Repeated runs for the same `$ARTICLE_DATE` always use the same analysis folder when `force_generation=false`. Analysis is the primary product — a run never ends empty-handed.
 
 All other rules (bash format, AWF shell safety, MCP access, download pipeline, analysis methodology & gate, article generation, commit & PR policy) live in the imported modules.
diff --git a/tests/workflow-architecture.test.ts b/tests/workflow-architecture.test.ts
index caac71a1c..7d7f12e1a 100644
--- a/tests/workflow-architecture.test.ts
+++ b/tests/workflow-architecture.test.ts
@@ -1277,7 +1277,7 @@ describe('Workflow timeout limits', () => {
     'news-translate.md',
   ];
 
-  it('no workflow should exceed 60-minute timeout', () => {
+  it('no workflow should exceed 55-minute timeout', () => {
     for (const workflowFile of ALL_NEWS_WORKFLOWS) {
       const filepath = path.join(WORKFLOWS_DIR, workflowFile);
       if (!fs.existsSync(filepath)) continue;
@@ -1288,8 +1288,8 @@ describe('Workflow timeout limits', () => {
       const timeout = parseInt(timeoutMatch[1]!, 10);
       expect(
         timeout,
-        `Workflow ${workflowFile} has timeout-minutes: ${timeout} which exceeds 60 minutes`
-      ).toBeLessThanOrEqual(60);
+        `Workflow ${workflowFile} has timeout-minutes: ${timeout} which exceeds 55 minutes (hard ceiling to keep runs within the ~60-minute Copilot API token window; see 00-base-contract.md §Session keepalive requirement)`
+      ).toBeLessThanOrEqual(55);
       }
     }
   });