From 9c8a0dc1223e886e3df791593ae560f83255d7e3 Mon Sep 17 00:00:00 2001 From: liplus-lin-lay Date: Thu, 19 Mar 2026 21:28:16 +0900 Subject: [PATCH 1/5] docs(notifications): add notifications layer spec MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Notifications layer を新しい numbered requirements spec として追加し、Home / Model / Operations / Adapter の通知記述をその正本へ接続した。 共有 webhook queue で前景一致通知だけを扱い、claim と cleanup を分離する設計を要求仕様として固定するための変更。 Refs #786 --- docs/1.-Model.md | 7 +- docs/3.-Operations.md | 2 + docs/4.-Adapter.md | 6 +- docs/5.-Notifications.md | 138 +++++++++++++++++++++++++++++++++++++++ docs/Home.md | 1 + 5 files changed, 150 insertions(+), 4 deletions(-) create mode 100644 docs/5.-Notifications.md diff --git a/docs/1.-Model.md b/docs/1.-Model.md index 0937ae1..8301f85 100644 --- a/docs/1.-Model.md +++ b/docs/1.-Model.md @@ -108,7 +108,7 @@ Li+ では、次の3つを別軸として扱う。 レイヤーごとに責務と再読込・再適用の条件を持つ。優先順は同一レイヤーの内部にだけ定義し、別レイヤー同士を勝敗関係として扱わない。 -4層構成。各プログラムファイルが自身のレイヤーを冒頭で宣言する。 +5層構成。各プログラムファイルが自身のレイヤーを冒頭で宣言する。 **Model Layer:** 不変原則、層内順序、対話面、挙動スタイル、タスクモード。Li+ program の基盤。他の全レイヤーがこれに依存する。 @@ -119,10 +119,13 @@ Li+ では、次の3つを別軸として扱う。 **Operations Layer:** ブランチ / コミット / 変更要求 / 検証 / マージ / リリースの手順。イベント駆動。毎セッション必須ではない。 +**Notifications Layer:** +GitHub Notifications API / webhook / local state fallback を横断する通知意味論。前景一致判定、claim/read/done、会話言及、janitor cleanup を定義する。 + **Adapter Layer:** ホスト注入、ランタイムトリガー、再読込配線、プラットフォーム固有バインディング。Li+ program をホスト環境へ接続する。 -接続チェーン: model → task → operations → adapter(依存順序のみ) +接続チェーン: model → task → operations → notifications → adapter(依存順序のみ) --- diff --git a/docs/3.-Operations.md b/docs/3.-Operations.md index 2f07815..b911303 100644 --- a/docs/3.-Operations.md +++ b/docs/3.-Operations.md @@ -327,6 +327,8 @@ DELETE /notifications/threads/{id} -> 204 完了(Inbox から除 scope = `notifications`(classic PAT) +通知 API の生操作はオペレーションレイヤーから参照してよい。ただし、前景関連性判定、`claim`、`ack/read`、`consume/done`、`mention`、`cleanup` の意味論は [5. 
Notifications](5.-Notifications) を正本とする。 + --- ## Discussions diff --git a/docs/4.-Adapter.md b/docs/4.-Adapter.md index 2008ad5..566f719 100644 --- a/docs/4.-Adapter.md +++ b/docs/4.-Adapter.md @@ -47,7 +47,7 @@ hook は Claude Code ランタイムが強制発火するため、AI の記憶 動作: 1. Character Instance 再通知:`.claude/CLAUDE.md` から Character Instance セクションを抽出して出力する。見つからない場合はスキップする。 -2. Webhook 通知チェック:bundled helper が存在し pending_count > 0 なら件数を出力する。それ以外は何もしない。 +2. 通知取り込み:利用可能な通知源を1回 `inspect` し、[5. Notifications](5.-Notifications) の規則で前景一致した summary だけをホストへ渡す。empty/no-op は出力しない。 ### post-tool-use.sh @@ -87,12 +87,14 @@ bootstrap は次回セッションから有効。現セッションは Li+config ホストが各ターン先頭でローカル確認を実行できる場合のみ使用する。確認処理は内部 housekeeping として無言で行い、確認中であることや empty/no-op 結果を会話へ出さない。 +アダプターが所有するのは transport の選択と summary の受け渡しである。関連性判定、`claim`、`ack/read`、`consume/done`、`mention`、`cleanup` の正本は [5. Notifications](5.-Notifications) に置く。 + 通知源の優先順位: 1. `mcp__github-webhook-mcp` 2. ローカル webhook ストア(`LI_PLUS_MODE=clone` かつ bundled helper が使える場合) 3. 利用不可 → 黙ってスキップ -pending が0件なら何も言わない。pending がある時だけ短く通知する。詳細が必要になるまでは full payload を開かない。このフローから別 AI プロセスを起動しない。 +アダプターは `inspect` を既定とし、前景一致しない通知を勝手に排水しない。詳細が必要になるまでは full payload を開かない。このフローから別 AI プロセスを起動しない。 --- diff --git a/docs/5.-Notifications.md b/docs/5.-Notifications.md new file mode 100644 index 0000000..1496c1b --- /dev/null +++ b/docs/5.-Notifications.md @@ -0,0 +1,138 @@ +# 通知レイヤー仕様書 + +本文書は Li+ プログラムの通知レイヤーの仕様を定義する。 +要求(何を満たすか)と仕様(どう振る舞うか)を一体として記述する。 + +通知レイヤーは、GitHub Notifications API、webhook、ローカル state dir、将来の受動受信をまたいで共通に使う意味論を定義する。polling か push かは transport の違いであり、通知の ownership と前景会話への出し方は本文書を正本とする。 + +--- + +## 目的 + +Li+ は前景セッションで軽量な通知差分を扱うが、共有 queue を雑に排水すると別 AI や別セッションの作業面を壊す。通知レイヤーは、通知の確認、所有、既読化、完了、会話への言及、清掃を分離し、前景スレッドの安定性を守る。 + +特に以下を満たす: + +- 前景セッションに関係する通知だけを選択して扱える +- 関係のない通知を勝手に消費しない +- 重要な前景外通知だけを例外的に会話へ出せる +- 明らかに無害で放置された stale 通知だけを清掃できる +- 将来 transport が受動受信へ変わっても上位意味論を変えない + +--- + +## 基本操作 + +通知レイヤーは次の操作を区別する。 + +| 操作 | 意味 | +|------|------| +| 
`inspect` | 未処理通知を読む。ownership は変えない | +| `claim` | 現在の前景セッションがその通知を担当対象として確保する | +| `ack/read` | 読んだことを記録する。Inbox から消すとは限らない | +| `consume/done` | 処理済みまたは意図的破棄として active queue から外す | +| `mention` | 現在の会話へ通知を出す | +| `cleanup` | 応答不要な stale 通知を janitor 的に片づける | + +`claim` は `ack/read` でも `consume/done` でもない。所有権の宣言であり、会話への言及や queue からの除去とは別に扱う。 + +--- + +## 前景セッションの既定 + +各ユーザーターンの先頭で、前景セッションは通知源を1回だけ `inspect` してよい。確認自体は内部 housekeeping であり、確認中であることや empty/no-op 結果を会話へ出さない。 + +前景一致判定は可能な限り機械的に行う。優先する手掛かりは以下: + +- 現在扱っている repository +- 現在の issue / PR / Discussion 番号 +- 現在の linked branch +- 進行中タスクと直接結び付く workflow / check / review 対象 + +既定動作は次のとおり: + +- 前景一致した通知だけを `claim` 対象にできる +- 前景一致した通知だけを `ack/read` または `consume/done` 候補にできる +- 関連性が安価に判定できない通知は `mention` せず、`consume/done` もしない +- 判断不能時の既定は「黙る・残す」に倒す + +--- + +## 例外的な会話言及 + +前景外通知でも、重要性が高い場合に限り `mention` を許可する。これは queue ownership とは別判断であり、`mention` したからといって自動で `consume/done` しない。 + +例外候補: + +- 外部人間からの Discussion / issue / PR コメント +- 現在の作業を止めうる failure / blocking / review request +- 自分たち以外からの明示的な呼びかけ + +曖昧な通知は例外扱いしない。 + +--- + +## マルチ AI 共有キュー + +Codex と Claude Code など複数の AI が同じ repository の通知 queue を共有する場合、一方の前景セッションが他方の通知を勝手に排水してはならない。 + +そのため、transport が許すなら通知 state は次を持てる形が望ましい: + +- `claimed_by` +- `claimed_at` +- `consumed_at` +- `reason` + +transport 自体に `claim` が無い場合は sidecar metadata で補う。どちらも使えない場合、破壊的な `consume/done` より preserve を優先する。 + +`drain all pending events` は暫定 helper 実装として存在しても、上位意味論の正本にはしない。 + +--- + +## 清掃(Janitor) + +前景一致しない通知でも、明らかに誰の応答も不要で、かつ一定時間以上放置された無害通知だけは `cleanup` してよい。 + +清掃候補: + +- 自分たちが発行した `check_run` / `workflow_run` の success +- 重複している自動生成通知 +- 後続イベントで意味を失った generated artifact + +清掃禁止: + +- 人間の comment / Discussion / review +- failure / blocking / changes requested +- ownership が曖昧な通知 +- relevance を安価に判定できない通知 + +清掃は relevance 判断の代替ではない。安全に無視できるものだけを対象にする。 + +--- + +## Transport Binding + +理想の意味論は GitHub Notifications API に寄せる。 + +| 意味論 | GitHub Notifications API | Webhook / local 
state fallback | +|--------|--------------------------|--------------------------------| +| `inspect` | `GET /notifications?all=false` | pending event の一覧取得 | +| `ack/read` | `PATCH /notifications/threads/{id}` | read 相当の state 更新 | +| `consume/done` | `DELETE /notifications/threads/{id}` | pending queue から除去 | +| `claim` | API 非対応。sidecar で補う | sidecar または state file で補う | + +transport は polling でも push でもよい。前景一致判定、例外的 `mention`、janitor `cleanup` の規則は transport によって変えない。 + +--- + +## 他レイヤーとの接続 + +**Operations Layer:** CI / review / release 待機で必要なイベント種別を定義するが、共有 queue の ownership と cleanup 規則は通知レイヤーを再定義しない。 + +**Adapter Layer:** 各ターン先頭で transport を `inspect` し、前景へ渡す summary を整える。関連性判断と destructive consume の正本は通知レイヤーに従う。 + +--- + +## 進化 + +再構築・削除・最適化はすべて許容する。構造の一貫性のみ維持する。 diff --git a/docs/Home.md b/docs/Home.md index 63a3b47..7ffa0f6 100755 --- a/docs/Home.md +++ b/docs/Home.md @@ -21,6 +21,7 @@ Li+ v1.0.0 の成立条件は到達済みとみなし、現在の本番はその | [2. Task](2.-Task) | タスクレイヤー仕様書 | | [3. Operations](3.-Operations) | オペレーションレイヤー仕様書 | | [4. Adapter](4.-Adapter) | アダプターレイヤー仕様書 | +| [5. 
Notifications](5.-Notifications) | 通知レイヤー仕様書 | --- From 72e7a943ba7022546e0028822d350d3326fd3d98 Mon Sep 17 00:00:00 2001 From: liplus-lin-lay Date: Thu, 19 Mar 2026 22:05:52 +0900 Subject: [PATCH 2/5] fix(notifications): replace drain-all foreground intake MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit foreground webhook intake を drain-all から inspect-first へ置き換え、foreground/notable/cleanup の分類と read/done/claim を helper 側へ分離した。 Claude Code hook と adapter/operations の記述も新しい helper の意味論に合わせて更新し、共有 queue を勝手に排水しない設計へ寄せた。 Refs #786 --- Li+agent.md | 8 +- Li+claude.md | 50 +- Li+operations.md | 9 +- scripts/check_webhook_notifications.py | 550 +++++++++++++++++++++- tests/test_check_webhook_notifications.py | 392 +++++++++++---- 5 files changed, 893 insertions(+), 116 deletions(-) diff --git a/Li+agent.md b/Li+agent.md index 7fa13c1..db574b0 100644 --- a/Li+agent.md +++ b/Li+agent.md @@ -66,7 +66,7 @@ Keep scope local: ## Optional Webhook Notification Flow -Adapter-side foreground intake only. Semantic policy remains in Li+operations.md. +Adapter-side foreground intake only. Semantic policy remains in Li+operations.md foreground intake rules. Use only in hosts that can run local commands from the workspace before replying. @@ -81,9 +81,9 @@ Use only in hosts that can run local commands from the workspace before replying 1. `LI_PLUS_WEBHOOK_STATE_DIR` from `Li+config.md` (absolute path or `workspace_root`-relative path) 2. `workspace_root/github-webhook-mcp` 3. `workspace_root/../github-webhook-mcp` - - if the bundled helper exists at `workspace_root/liplus-language/scripts/check_webhook_notifications.py` and the state dir resolves, run it with `--limit 5 --consume` + - if the bundled helper exists at `workspace_root/liplus-language/scripts/check_webhook_notifications.py` and the state dir resolves, run it in inspect mode (`--limit 5`) and pass cheap foreground hints such as repo / branch when available - else: skip silently. -3. 
Mention notifications only when new items exist. -4. If the local helper surfaces items, treat them as consumed immediately and delete related generated files. +3. Mention notifications only when foreground-matched items or exceptional notable items exist. +4. Do not auto-consume the local backlog from this foreground inspect path. `claim` / `read` / `done` / `cleanup` require explicit helper commands or a deeper workflow that owns the notification. 5. Do not launch a separate AI process for webhook replies from this foreground flow. 6. Do not open the full webhook payload unless deeper inspection is actually needed. diff --git a/Li+claude.md b/Li+claude.md index bfc3e6f..503453e 100644 --- a/Li+claude.md +++ b/Li+claude.md @@ -1,7 +1,7 @@ # Li+claude.md — Claude Code Hook Definitions Layer = Adapter Layer (Claude Code binding) -Semantic source = Li+agent.md trigger contract + Model Layer / Task Layer / Operations Layer. +Semantic source = Li+agent.md trigger contract + Model Layer / Task Layer / Operations Layer foreground intake rules. This file compiles adapter rules into Claude Code hooks. Bootstrap target: runtime=claude only. 
@@ -58,6 +58,12 @@ fi HELPER="$PROJECT_ROOT/liplus-language/scripts/check_webhook_notifications.py" [ -f "$HELPER" ] || exit 0 +repo_from_origin() { + git -C "$PROJECT_ROOT" remote get-url origin 2>/dev/null \ + | grep -oE '[^/@:]+/[^/]+$' \ + | sed 's/\.git$//' 2>/dev/null || echo "" +} + # Parse LI_PLUS_WEBHOOK_STATE_DIR from Li+config.md if set CONFIG_MD="$PROJECT_ROOT/Li+config.md" STATE_DIR_ARGS=() @@ -66,17 +72,49 @@ if [ -f "$CONFIG_MD" ]; then [ -n "$VAL" ] && STATE_DIR_ARGS=(--state-dir "$VAL") fi -RESULT=$(python3 "$HELPER" --workspace-root "$PROJECT_ROOT" "${STATE_DIR_ARGS[@]}" --limit 5 2>/dev/null) +CURRENT_REPO=$(repo_from_origin) +CURRENT_BRANCH=$(git -C "$PROJECT_ROOT" branch --show-current 2>/dev/null || echo "") + +HELPER_ARGS=( + --workspace-root "$PROJECT_ROOT" + "${STATE_DIR_ARGS[@]}" + --limit 5 + --internal-sender liplus-lin-lay + --internal-sender lipluscodex +) +[ -n "$CURRENT_REPO" ] && HELPER_ARGS+=(--repo "$CURRENT_REPO") +if [ -n "$CURRENT_BRANCH" ]; then + HELPER_ARGS+=(--branch "$CURRENT_BRANCH" --infer-numbers-from-branch) +fi + +RESULT=$(python3 "$HELPER" "${HELPER_ARGS[@]}" 2>/dev/null) [ -z "$RESULT" ] && exit 0 -PENDING=$(printf '%s' "$RESULT" | python3 -c "import sys,json; print(json.load(sys.stdin).get('pending_count',0))" 2>/dev/null) -if [ -z "$PENDING" ] || [ "$PENDING" = "0" ]; then +MENTION_COUNT=$(printf '%s' "$RESULT" | python3 -c "import sys,json; print(json.load(sys.stdin).get('mention_count',0))" 2>/dev/null) +if [ -z "$MENTION_COUNT" ] || [ "$MENTION_COUNT" = "0" ]; then exit 0 fi echo "" -echo "━━━ Webhook: $PENDING pending notification(s) ━━━" -echo "Run: check_webhook_notifications.py --consume or use mcp__github-webhook-mcp" +echo "━━━ Webhook: foreground/notable notification(s) ━━━" +printf '%s' "$RESULT" | python3 -c ' +import json +import sys + +data = json.load(sys.stdin) +seen = set() +for bucket, label in (("relevant_items", "foreground"), ("notable_items", "notable")): + for item in data.get(bucket, []): + 
event_id = item.get("id") + if event_id in seen: + continue + seen.add(event_id) + number = item.get("number") + title = item.get("title") or item.get("type") or "notification" + event_type = item.get("type") or "event" + prefix = f"#{number} " if number is not None else "" + print(f"[{label}] {event_type} {prefix}{title}") +' echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" ``` diff --git a/Li+operations.md b/Li+operations.md index 8169270..c4b3f54 100644 --- a/Li+operations.md +++ b/Li+operations.md @@ -300,13 +300,14 @@ Event-Driven Operations b = {workspace_root}/github-webhook-mcp c = {workspace_root}/../github-webhook-mcp if helper missing or state dir unresolved = skip silently - helper output = latest lightweight summaries only - on consume = drain all current pending events immediately and delete related generated files + helper output = inspect summary with foreground-matched items, notable items, and cleanup candidates + helper default = inspect only; preserve unmatched backlog + destructive actions = explicit `read` / `done` / `claim` / `cleanup-safe-success` calls only foreground handling: each user turn start = inspect once before main reply - pending_count == 0 = no mention - pending_count > 0 = brief mention before main reply + mention only = foreground-matched items or exceptional notable items + if relevance cannot be judged cheaply = preserve and stay silent full payload = open only when deeper inspection is needed separate AI process launch = prohibited for this flow diff --git a/scripts/check_webhook_notifications.py b/scripts/check_webhook_notifications.py index 88b1980..22d996f 100644 --- a/scripts/check_webhook_notifications.py +++ b/scripts/check_webhook_notifications.py @@ -1,15 +1,42 @@ #!/usr/bin/env python3 -"""Inspect lightweight pending GitHub webhook notifications from a local state dir.""" +"""Inspect lightweight GitHub webhook notifications from a local state dir.""" from __future__ import annotations import argparse import 
json import os +import re +from dataclasses import dataclass +from datetime import datetime, timedelta, timezone from pathlib import Path from typing import Any ENV_STATE_DIR = "LI_PLUS_WEBHOOK_STATE_DIR" +CLAIMS_FILENAME = "notification-claims.json" +SUCCESS_CONCLUSIONS = {"success", "skipped", "neutral"} +COMMENT_EVENT_TYPES = { + "discussion", + "discussion_comment", + "issue_comment", + "pull_request_review", + "pull_request_review_comment", +} + + +@dataclass(frozen=True) +class InspectContext: + repo: str | None + numbers: frozenset[str] + branches: frozenset[str] + internal_senders: frozenset[str] + + def as_payload(self) -> dict[str, Any]: + return { + "repo": self.repo, + "numbers": sorted(self.numbers), + "branches": sorted(self.branches), + } def default_workspace_root(script_path: Path | None = None) -> Path: @@ -45,6 +72,39 @@ def save_events(path: Path, events: list[dict[str, Any]]) -> None: path.write_text(json.dumps(events, ensure_ascii=False, indent=2), encoding="utf-8") +def claims_path(state_dir: Path) -> Path: + return state_dir / CLAIMS_FILENAME + + +def load_claims(state_dir: Path) -> dict[str, dict[str, Any]]: + path = claims_path(state_dir) + if not path.exists(): + return {} + return json.loads(path.read_text(encoding="utf-8")) + + +def save_claims(state_dir: Path, claims: dict[str, dict[str, Any]]) -> None: + path = claims_path(state_dir) + if claims: + path.write_text(json.dumps(claims, ensure_ascii=False, indent=2), encoding="utf-8") + return + if path.exists(): + path.unlink() + + +def clear_claims(state_dir: Path, ids: set[str]) -> None: + if not ids: + return + claims = load_claims(state_dir) + changed = False + for event_id in ids: + if event_id in claims: + del claims[event_id] + changed = True + if changed: + save_claims(state_dir, claims) + + def body_preview(payload: dict[str, Any]) -> str: for key in ("comment", "review", "discussion", "issue", "pull_request"): body = (payload.get(key) or {}).get("body") @@ -79,6 +139,7 @@ def 
item_title(payload: dict[str, Any]) -> str: def item_url(payload: dict[str, Any]) -> str: return ( (payload.get("comment") or {}).get("html_url") + or (payload.get("review") or {}).get("html_url") or (payload.get("issue") or {}).get("html_url") or (payload.get("pull_request") or {}).get("html_url") or (payload.get("discussion") or {}).get("html_url") @@ -88,10 +149,37 @@ def item_url(payload: dict[str, Any]) -> str: ) -def summarize(event: dict[str, Any]) -> dict[str, Any]: +def item_branches(payload: dict[str, Any]) -> list[str]: + branches = { + value + for value in ( + (payload.get("workflow_run") or {}).get("head_branch"), + (payload.get("check_suite") or {}).get("head_branch"), + ((payload.get("check_run") or {}).get("check_suite") or {}).get("head_branch"), + ((payload.get("pull_request") or {}).get("head") or {}).get("ref"), + ) + if value + } + return sorted(branches) + + +def item_conclusion(payload: dict[str, Any]) -> str: + return ( + (payload.get("check_run") or {}).get("conclusion") + or (payload.get("workflow_run") or {}).get("conclusion") + or "" + ) + + +def item_review_state(payload: dict[str, Any]) -> str: + return (payload.get("review") or {}).get("state") or "" + + +def summarize(event: dict[str, Any], *, claims: dict[str, dict[str, Any]] | None = None) -> dict[str, Any]: payload = event.get("payload") or {} + event_id = str(event.get("id")) return { - "id": event.get("id"), + "id": event_id, "type": event.get("type"), "action": payload.get("action"), "repo": (payload.get("repository") or {}).get("full_name"), @@ -101,6 +189,10 @@ def summarize(event: dict[str, Any]) -> dict[str, Any]: "url": item_url(payload), "received_at": event.get("received_at"), "preview": body_preview(payload), + "branches": item_branches(payload), + "conclusion": item_conclusion(payload), + "review_state": item_review_state(payload), + "claim": (claims or {}).get(event_id), } @@ -120,7 +212,13 @@ def delete_artifacts(paths: list[Path]) -> list[str]: return deleted -def 
remove_events(path: Path, events: list[dict[str, Any]], ids: set[str], *, state_dir: Path) -> tuple[list[str], list[str]]: +def remove_events( + path: Path, + events: list[dict[str, Any]], + ids: set[str], + *, + state_dir: Path, +) -> tuple[list[str], list[str]]: removed_ids: list[str] = [] deleted_paths: list[str] = [] if not ids: @@ -137,9 +235,217 @@ def remove_events(path: Path, events: list[dict[str, Any]], ids: set[str], *, st if removed_ids: save_events(path, remaining_events) + clear_claims(state_dir, set(removed_ids)) return removed_ids, deleted_paths +def mark_events_read(path: Path, events: list[dict[str, Any]], ids: set[str], *, state_dir: Path) -> list[str]: + if not ids: + return [] + + marked_ids: list[str] = [] + updated_events: list[dict[str, Any]] = [] + changed = False + for event in events: + event_id = str(event.get("id")) + if event_id in ids and not event.get("processed", False): + updated_event = dict(event) + updated_event["processed"] = True + updated_events.append(updated_event) + marked_ids.append(event_id) + changed = True + continue + updated_events.append(event) + + if changed: + save_events(path, updated_events) + clear_claims(state_dir, set(marked_ids)) + return marked_ids + + +def now_utc() -> datetime: + return datetime.now(timezone.utc) + + +def now_utc_iso() -> str: + return now_utc().replace(microsecond=0).isoformat().replace("+00:00", "Z") + + +def parse_timestamp(value: str | None) -> datetime | None: + if not value: + return None + try: + return datetime.fromisoformat(value.replace("Z", "+00:00")) + except ValueError: + return None + + +def infer_numbers_from_branches(branches: set[str]) -> set[str]: + numbers: set[str] = set() + pattern = re.compile(r"(?:^|/)(\d+)(?:[-/]|$)") + for branch in branches: + match = pattern.search(branch) + if match: + numbers.add(match.group(1)) + return numbers + + +def build_context(args: argparse.Namespace) -> InspectContext: + branches = {branch for branch in (args.branch or []) if branch} 
+ numbers = {str(number) for number in (args.number or []) if str(number)} + if args.infer_numbers_from_branch: + numbers |= infer_numbers_from_branches(branches) + return InspectContext( + repo=args.repo or None, + numbers=frozenset(numbers), + branches=frozenset(branches), + internal_senders=frozenset(args.internal_sender or []), + ) + + +def is_internal_sender(sender: str | None, internal_senders: frozenset[str]) -> bool: + if not sender: + return False + if sender in internal_senders: + return True + return sender.endswith("[bot]") + + +def foreground_reasons(summary: dict[str, Any], context: InspectContext) -> list[str]: + if not context.repo or summary.get("repo") != context.repo: + return [] + + reasons: list[str] = [] + number = summary.get("number") + if context.numbers and number is not None and str(number) in context.numbers: + reasons.append("number") + + branches = set(summary.get("branches") or []) + if context.branches and branches.intersection(context.branches): + reasons.append("branch") + + return reasons + + +def notable_reason(summary: dict[str, Any], context: InspectContext) -> str | None: + if not context.repo or summary.get("repo") != context.repo: + return None + if is_internal_sender(summary.get("sender"), context.internal_senders): + return None + + event_type = summary.get("type") or "" + if event_type not in COMMENT_EVENT_TYPES: + return None + if event_type == "pull_request_review": + return "external_review" + if event_type == "pull_request_review_comment": + return "external_review_comment" + if event_type == "discussion": + return "external_discussion" + if event_type == "discussion_comment": + return "external_discussion_comment" + return "external_comment" + + +def is_cleanup_candidate( + summary: dict[str, Any], + context: InspectContext, + *, + older_than: timedelta, + current_time: datetime, +) -> bool: + if not context.repo or summary.get("repo") != context.repo: + return False + if not is_internal_sender(summary.get("sender"), 
context.internal_senders): + return False + if summary.get("type") not in {"check_run", "workflow_run"}: + return False + if summary.get("conclusion") not in SUCCESS_CONCLUSIONS: + return False + + received_at = parse_timestamp(summary.get("received_at")) + if received_at is None: + return False + return current_time - received_at >= older_than + + +def annotated_summary( + summary: dict[str, Any], + *, + relevant_reasons: list[str] | None = None, + notable: str | None = None, + cleanup_candidate: bool = False, +) -> dict[str, Any]: + annotated = dict(summary) + if relevant_reasons: + annotated["relevant_reasons"] = relevant_reasons + if notable: + annotated["notable_reason"] = notable + if cleanup_candidate: + annotated["cleanup_candidate"] = True + return annotated + + +def unique_by_id(items: list[dict[str, Any]]) -> list[dict[str, Any]]: + seen: set[str] = set() + result: list[dict[str, Any]] = [] + for item in items: + event_id = str(item.get("id")) + if event_id in seen: + continue + seen.add(event_id) + result.append(item) + return result + + +def tail(items: list[dict[str, Any]], limit: int) -> list[dict[str, Any]]: + if limit <= 0: + return items + return items[-limit:] + + +def evaluate_pending( + events: list[dict[str, Any]], + *, + claims: dict[str, dict[str, Any]], + context: InspectContext, + cleanup_after: timedelta, +) -> dict[str, list[dict[str, Any]]]: + pending_events = [event for event in events if not event.get("processed", False)] + pending_summaries = [summarize(event, claims=claims) for event in pending_events] + + relevant_items: list[dict[str, Any]] = [] + notable_items: list[dict[str, Any]] = [] + cleanup_candidates: list[dict[str, Any]] = [] + current_time = now_utc() + + for summary in pending_summaries: + reasons = foreground_reasons(summary, context) + if reasons: + relevant_items.append(annotated_summary(summary, relevant_reasons=reasons)) + + notable = notable_reason(summary, context) + if notable: + 
notable_items.append(annotated_summary(summary, notable=notable)) + + if is_cleanup_candidate(summary, context, older_than=cleanup_after, current_time=current_time): + cleanup_candidates.append(annotated_summary(summary, cleanup_candidate=True)) + + mention_ids = { + str(item.get("id")) + for item in relevant_items + notable_items + } + mention_items = [summary for summary in pending_summaries if str(summary.get("id")) in mention_ids] + + return { + "pending": pending_summaries, + "relevant": relevant_items, + "notable": notable_items, + "mention": unique_by_id(mention_items), + "cleanup": cleanup_candidates, + } + + def no_source_payload() -> dict[str, Any]: return { "source": "none", @@ -147,14 +453,23 @@ def no_source_payload() -> dict[str, Any]: "pending_count": 0, "consumed_count": 0, "remaining_count": 0, + "relevant_count": 0, + "notable_count": 0, + "mention_count": 0, + "cleanup_candidate_count": 0, + "context": {"repo": None, "numbers": [], "branches": []}, "items": [], + "relevant_items": [], + "notable_items": [], + "mention_items": [], + "cleanup_candidates": [], "deleted_paths": [], } def consume_pending(path: Path, events: list[dict[str, Any]], *, limit: int, state_dir: Path) -> dict[str, Any]: pending = [event for event in events if not event.get("processed", False)] - surfaced = pending[-limit:] if limit > 0 else pending + surfaced = tail([summarize(event) for event in pending], limit) pending_ids = {str(event.get("id")) for event in pending} removed_ids, deleted_paths = remove_events(path, events, pending_ids, state_dir=state_dir) return { @@ -163,24 +478,77 @@ def consume_pending(path: Path, events: list[dict[str, Any]], *, limit: int, sta "pending_count": len(pending), "consumed_count": len(removed_ids), "remaining_count": max(len(pending) - len(removed_ids), 0), - "items": [summarize(event) for event in surfaced], + "items": surfaced, "deleted_paths": deleted_paths, + "legacy_consume": True, } -def inspect_pending(path: Path, *, limit: int, 
state_dir: Path) -> dict[str, Any]: +def inspect_pending( + path: Path, + *, + limit: int, + state_dir: Path, + context: InspectContext, + cleanup_after: timedelta, +) -> dict[str, Any]: events = load_events(path) - pending = [event for event in events if not event.get("processed", False)] - selected = pending[-limit:] if limit > 0 else pending + claims = load_claims(state_dir) + evaluated = evaluate_pending(events, claims=claims, context=context, cleanup_after=cleanup_after) return { "source": "local_state_dir", "state_dir": str(state_dir), - "pending_count": len(pending), - "items": [summarize(event) for event in selected], + "pending_count": len(evaluated["pending"]), + "relevant_count": len(evaluated["relevant"]), + "notable_count": len(evaluated["notable"]), + "mention_count": len(evaluated["mention"]), + "cleanup_candidate_count": len(evaluated["cleanup"]), + "context": context.as_payload(), + "items": tail(evaluated["pending"], limit), + "relevant_items": tail(evaluated["relevant"], limit), + "notable_items": tail(evaluated["notable"], limit), + "mention_items": tail(evaluated["mention"], limit), + "cleanup_candidates": tail(evaluated["cleanup"], limit), } -def main() -> int: +def claim_ids( + state_dir: Path, + ids: list[str], + *, + claimant: str, + reason: str | None, + force: bool, +) -> tuple[list[str], list[dict[str, Any]]]: + claims = load_claims(state_dir) + claimed_ids: list[str] = [] + skipped: list[dict[str, Any]] = [] + timestamp = now_utc_iso() + + for event_id in ids: + existing = claims.get(event_id) + if existing and existing.get("claimed_by") not in {None, "", claimant} and not force: + skipped.append( + { + "id": event_id, + "claimed_by": existing.get("claimed_by"), + } + ) + continue + + claims[event_id] = { + "claimed_by": claimant, + "claimed_at": timestamp, + "reason": reason or "", + } + claimed_ids.append(event_id) + + if claimed_ids: + save_claims(state_dir, claims) + return claimed_ids, skipped + + +def parse_args() -> 
argparse.Namespace: parser = argparse.ArgumentParser(description="Inspect pending webhook notifications") parser.add_argument( "--workspace-root", @@ -192,20 +560,88 @@ def main() -> int: default=None, help="Directory containing github-webhook-mcp state files", ) - parser.add_argument("--limit", type=int, default=5, help="Maximum pending items to return") + parser.add_argument("--limit", type=int, default=5, help="Maximum items to return per bucket") + parser.add_argument("--repo", default=None, help="Foreground repository full name") + parser.add_argument("--number", action="append", default=[], help="Foreground issue/PR/discussion number") + parser.add_argument("--branch", action="append", default=[], help="Foreground branch name") + parser.add_argument( + "--infer-numbers-from-branch", + action="store_true", + help="Infer foreground issue numbers from branch names like spec/786-name", + ) + parser.add_argument( + "--internal-sender", + action="append", + default=[], + help="Sender login that should not be treated as external/notable", + ) parser.add_argument( "--consume", action="store_true", - help="Return pending summaries and delete the surfaced event logs immediately", + help="Legacy mode: delete every pending event and related artifacts immediately", ) parser.add_argument( "--ack", nargs="*", default=None, + help="Legacy alias for --done", + ) + parser.add_argument( + "--done", + nargs="*", + default=None, help="Delete the specified event ids and related generated files", ) + parser.add_argument( + "--read", + nargs="*", + default=None, + help="Mark the specified event ids as read without deleting them", + ) + parser.add_argument( + "--claim-matched", + action="store_true", + help="Claim all foreground-matched pending events", + ) + parser.add_argument("--claimant", default=None, help="Name stored in claim metadata") + parser.add_argument("--reason", default=None, help="Optional reason stored with a claim") + parser.add_argument( + "--force-claim", + 
action="store_true", + help="Allow a claimant to overwrite an existing claim held by someone else", + ) + parser.add_argument( + "--cleanup-safe-success", + action="store_true", + help="Delete old internal success check/workflow notifications for the foreground repo", + ) + parser.add_argument( + "--older-than-hours", + type=float, + default=24.0, + help="Age threshold used by --cleanup-safe-success", + ) args = parser.parse_args() + action_count = sum( + [ + bool(args.consume), + args.ack is not None, + args.done is not None, + args.read is not None, + bool(args.claim_matched), + bool(args.cleanup_safe_success), + ] + ) + if action_count > 1: + parser.error("choose only one action") + if args.claim_matched and not args.claimant: + parser.error("--claimant is required with --claim-matched") + return args + + +def main() -> int: + args = parse_args() workspace_root = Path(args.workspace_root).resolve() if args.workspace_root else default_workspace_root() configured_state_dir = args.state_dir or os.environ.get(ENV_STATE_DIR) state_dir = resolve_state_dir(configured_state_dir, workspace_root) @@ -215,17 +651,82 @@ def main() -> int: events_path = state_dir / "events.json" events = load_events(events_path) + context = build_context(args) + cleanup_after = timedelta(hours=args.older_than_hours) + + if args.ack is not None or args.done is not None: + ids = [event_id for event_id in (args.done if args.done is not None else args.ack) if event_id] + done_ids, deleted_paths = remove_events(events_path, events, set(ids), state_dir=state_dir) + print( + json.dumps( + { + "source": "local_state_dir", + "state_dir": str(state_dir), + "done_ids": done_ids, + "done_count": len(done_ids), + "deleted_paths": deleted_paths, + }, + ensure_ascii=False, + ) + ) + return 0 + + if args.read is not None: + ids = [event_id for event_id in args.read if event_id] + read_ids = mark_events_read(events_path, events, set(ids), state_dir=state_dir) + print( + json.dumps( + { + "source": 
"local_state_dir", + "state_dir": str(state_dir), + "read_ids": read_ids, + "read_count": len(read_ids), + }, + ensure_ascii=False, + ) + ) + return 0 + + if args.claim_matched: + claims = load_claims(state_dir) + evaluated = evaluate_pending(events, claims=claims, context=context, cleanup_after=cleanup_after) + claimable_ids = [str(item.get("id")) for item in evaluated["relevant"]] + claimed_ids, skipped = claim_ids( + state_dir, + claimable_ids, + claimant=args.claimant, + reason=args.reason, + force=args.force_claim, + ) + print( + json.dumps( + { + "source": "local_state_dir", + "state_dir": str(state_dir), + "claimed_ids": claimed_ids, + "claimed_count": len(claimed_ids), + "skipped": skipped, + "context": context.as_payload(), + }, + ensure_ascii=False, + ) + ) + return 0 - if args.ack is not None: - acked_ids, deleted_paths = remove_events(events_path, events, set(args.ack), state_dir=state_dir) + if args.cleanup_safe_success: + claims = load_claims(state_dir) + evaluated = evaluate_pending(events, claims=claims, context=context, cleanup_after=cleanup_after) + cleanup_ids = [str(item.get("id")) for item in evaluated["cleanup"]] + removed_ids, deleted_paths = remove_events(events_path, events, set(cleanup_ids), state_dir=state_dir) print( json.dumps( { "source": "local_state_dir", "state_dir": str(state_dir), - "acked_ids": acked_ids, - "acked_count": len(acked_ids), + "cleanup_ids": removed_ids, + "cleanup_count": len(removed_ids), "deleted_paths": deleted_paths, + "context": context.as_payload(), }, ensure_ascii=False, ) @@ -236,7 +737,18 @@ def main() -> int: print(json.dumps(consume_pending(events_path, events, limit=args.limit, state_dir=state_dir), ensure_ascii=False)) return 0 - print(json.dumps(inspect_pending(events_path, limit=args.limit, state_dir=state_dir), ensure_ascii=False)) + print( + json.dumps( + inspect_pending( + events_path, + limit=args.limit, + state_dir=state_dir, + context=context, + cleanup_after=cleanup_after, + ), + 
ensure_ascii=False, + ) + ) return 0 diff --git a/tests/test_check_webhook_notifications.py b/tests/test_check_webhook_notifications.py index 1d16079..59b1257 100644 --- a/tests/test_check_webhook_notifications.py +++ b/tests/test_check_webhook_notifications.py @@ -4,6 +4,7 @@ import sys import tempfile import unittest +from datetime import timedelta from pathlib import Path sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "scripts")) @@ -12,6 +13,30 @@ class CheckWebhookNotificationsTest(unittest.TestCase): + REPO = "Liplus-Project/liplus-language" + BRANCH = "spec/786-notifications-layer" + + def make_state_dir(self) -> tuple[Path, Path, Path]: + tmp = tempfile.TemporaryDirectory() + self.addCleanup(tmp.cleanup) + + state_dir = Path(tmp.name) / "github-webhook-mcp" + trigger_dir = state_dir / "trigger-events" + runs_dir = state_dir / "codex-runs" + trigger_dir.mkdir(parents=True) + runs_dir.mkdir() + return state_dir, trigger_dir, runs_dir + + def inspect_context(self, **overrides: object) -> module.InspectContext: + data = { + "repo": self.REPO, + "numbers": frozenset({"786"}), + "branches": frozenset({self.BRANCH}), + "internal_senders": frozenset({"liplus-lin-lay", "lipluscodex"}), + } + data.update(overrides) + return module.InspectContext(**data) + def test_resolve_state_dir_uses_configured_relative_path(self) -> None: with tempfile.TemporaryDirectory() as tmp: workspace_root = Path(tmp) @@ -29,109 +54,310 @@ def test_resolve_state_dir_detects_parent_candidate(self) -> None: self.assertEqual(resolved, parent_candidate) - def test_consume_pending_removes_events_and_artifacts(self) -> None: - with tempfile.TemporaryDirectory() as tmp: - state_dir = Path(tmp) / "github-webhook-mcp" - trigger_dir = state_dir / "trigger-events" - runs_dir = state_dir / "codex-runs" - trigger_dir.mkdir(parents=True) - runs_dir.mkdir() - - events_path = state_dir / "events.json" - event = { + def test_infer_numbers_from_branch_names(self) -> None: + numbers = 
module.infer_numbers_from_branches({"spec/786-notifications-layer", "778-repo-first"}) + self.assertEqual(numbers, {"786", "778"}) + + def test_inspect_pending_classifies_foreground_notable_and_cleanup(self) -> None: + state_dir, _, _ = self.make_state_dir() + events_path = state_dir / "events.json" + now = module.now_utc() + recent = now - timedelta(hours=1) + old = now - timedelta(hours=48) + + events = [ + { "id": "evt-1", "type": "issue_comment", "processed": False, - "received_at": "2026-03-15T00:00:00Z", + "received_at": recent.replace(microsecond=0).isoformat().replace("+00:00", "Z"), "payload": { "action": "created", - "repository": {"full_name": "Liplus-Project/liplus-language"}, + "repository": {"full_name": self.REPO}, "sender": {"login": "master"}, - "issue": {"number": 730, "title": "Webhook path fallback"}, - "comment": { - "body": "見れる?", - "html_url": "https://github.com/Liplus-Project/liplus-language/issues/730#issuecomment-1", + "issue": {"number": 786, "title": "Notifications layer"}, + "comment": {"body": "please check", "html_url": "https://example.com/1"}, + }, + }, + { + "id": "evt-2", + "type": "workflow_run", + "processed": False, + "received_at": old.replace(microsecond=0).isoformat().replace("+00:00", "Z"), + "payload": { + "action": "completed", + "repository": {"full_name": self.REPO}, + "sender": {"login": "liplus-lin-lay"}, + "workflow_run": { + "name": "Liplus Governance CI", + "head_branch": self.BRANCH, + "conclusion": "success", + "html_url": "https://example.com/2", }, }, - } - events_path.write_text(json.dumps([event], ensure_ascii=False, indent=2), encoding="utf-8") - (trigger_dir / "evt-1.json").write_text("{}", encoding="utf-8") - (runs_dir / "evt-1.md").write_text("result", encoding="utf-8") + }, + { + "id": "evt-3", + "type": "issue_comment", + "processed": False, + "received_at": recent.replace(microsecond=0).isoformat().replace("+00:00", "Z"), + "payload": { + "action": "created", + "repository": {"full_name": self.REPO}, 
+ "sender": {"login": "friend"}, + "issue": {"number": 999, "title": "Other thread"}, + "comment": {"body": "FYI", "html_url": "https://example.com/3"}, + }, + }, + { + "id": "evt-4", + "type": "pull_request_review", + "processed": False, + "received_at": recent.replace(microsecond=0).isoformat().replace("+00:00", "Z"), + "payload": { + "action": "submitted", + "repository": {"full_name": self.REPO}, + "sender": {"login": "reviewer"}, + "pull_request": { + "number": 55, + "title": "Notifications PR", + "html_url": "https://example.com/4", + "head": {"ref": self.BRANCH}, + }, + "review": { + "state": "changes_requested", + "body": "Needs work", + "html_url": "https://example.com/review", + }, + }, + }, + ] + events_path.write_text(json.dumps(events, ensure_ascii=False, indent=2), encoding="utf-8") + + payload = module.inspect_pending( + events_path, + limit=10, + state_dir=state_dir, + context=self.inspect_context(), + cleanup_after=timedelta(hours=24), + ) + + self.assertEqual(payload["pending_count"], 4) + self.assertEqual(payload["relevant_count"], 3) + self.assertEqual(payload["notable_count"], 3) + self.assertEqual(payload["mention_count"], 4) + self.assertEqual(payload["cleanup_candidate_count"], 1) + self.assertEqual({item["id"] for item in payload["relevant_items"]}, {"evt-1", "evt-2", "evt-4"}) + self.assertEqual({item["id"] for item in payload["notable_items"]}, {"evt-1", "evt-3", "evt-4"}) + self.assertEqual([item["id"] for item in payload["cleanup_candidates"]], ["evt-2"]) - payload = module.consume_pending(events_path, [event], limit=5, state_dir=state_dir) + def test_mark_events_read_preserves_history(self) -> None: + state_dir, _, _ = self.make_state_dir() + events_path = state_dir / "events.json" + event = { + "id": "evt-1", + "type": "issue_comment", + "processed": False, + "received_at": "2026-03-15T00:00:00Z", + "payload": { + "action": "created", + "repository": {"full_name": self.REPO}, + "sender": {"login": "master"}, + "issue": {"number": 786, 
"title": "Notifications layer"}, + "comment": {"body": "seen?", "html_url": "https://example.com/1"}, + }, + } + events_path.write_text(json.dumps([event], ensure_ascii=False, indent=2), encoding="utf-8") + + read_ids = module.mark_events_read(events_path, [event], {"evt-1"}, state_dir=state_dir) + payload = module.inspect_pending( + events_path, + limit=5, + state_dir=state_dir, + context=self.inspect_context(), + cleanup_after=timedelta(hours=24), + ) + + self.assertEqual(read_ids, ["evt-1"]) + self.assertEqual(payload["pending_count"], 0) + saved = json.loads(events_path.read_text(encoding="utf-8")) + self.assertTrue(saved[0]["processed"]) - self.assertEqual(payload["pending_count"], 1) - self.assertEqual(payload["consumed_count"], 1) - self.assertEqual(payload["remaining_count"], 0) - self.assertEqual(payload["items"][0]["id"], "evt-1") - self.assertEqual(json.loads(events_path.read_text(encoding="utf-8")), []) - self.assertFalse((trigger_dir / "evt-1.json").exists()) - self.assertFalse((runs_dir / "evt-1.md").exists()) + def test_claim_ids_preserves_existing_claim_by_default(self) -> None: + state_dir, _, _ = self.make_state_dir() + + claimed_ids, skipped = module.claim_ids( + state_dir, + ["evt-1"], + claimant="Lin", + reason="foreground", + force=False, + ) + self.assertEqual(claimed_ids, ["evt-1"]) + self.assertEqual(skipped, []) + + claimed_ids, skipped = module.claim_ids( + state_dir, + ["evt-1"], + claimant="Lay", + reason="other session", + force=False, + ) + self.assertEqual(claimed_ids, []) + self.assertEqual(skipped[0]["claimed_by"], "Lin") + + def test_cleanup_safe_success_removes_only_old_internal_success(self) -> None: + state_dir, trigger_dir, runs_dir = self.make_state_dir() + events_path = state_dir / "events.json" + old = "2020-01-01T00:00:00Z" + + success_event = { + "id": "evt-1", + "type": "workflow_run", + "processed": False, + "received_at": old, + "payload": { + "action": "completed", + "repository": {"full_name": self.REPO}, + "sender": 
{"login": "liplus-lin-lay"}, + "workflow_run": { + "name": "Liplus Governance CI", + "head_branch": "build-2026-03-19.3", + "conclusion": "success", + "html_url": "https://example.com/workflow", + }, + }, + } + comment_event = { + "id": "evt-2", + "type": "issue_comment", + "processed": False, + "received_at": old, + "payload": { + "action": "created", + "repository": {"full_name": self.REPO}, + "sender": {"login": "master"}, + "issue": {"number": 900, "title": "Keep me"}, + "comment": {"body": "still relevant", "html_url": "https://example.com/comment"}, + }, + } + events = [success_event, comment_event] + events_path.write_text(json.dumps(events, ensure_ascii=False, indent=2), encoding="utf-8") + for event in events: + (trigger_dir / f"{event['id']}.json").write_text("{}", encoding="utf-8") + (runs_dir / f"{event['id']}.md").write_text("result", encoding="utf-8") + + evaluated = module.evaluate_pending( + events, + claims={}, + context=self.inspect_context(numbers=frozenset(), branches=frozenset()), + cleanup_after=timedelta(hours=24), + ) + removed_ids, deleted_paths = module.remove_events( + events_path, + events, + {item["id"] for item in evaluated["cleanup"]}, + state_dir=state_dir, + ) + + self.assertEqual(removed_ids, ["evt-1"]) + self.assertIn(str(trigger_dir / "evt-1.json"), deleted_paths) + self.assertIn(str(runs_dir / "evt-1.md"), deleted_paths) + remaining = json.loads(events_path.read_text(encoding="utf-8")) + self.assertEqual([event["id"] for event in remaining], ["evt-2"]) + + def test_consume_pending_removes_events_and_artifacts(self) -> None: + state_dir, trigger_dir, runs_dir = self.make_state_dir() + events_path = state_dir / "events.json" + event = { + "id": "evt-1", + "type": "issue_comment", + "processed": False, + "received_at": "2026-03-15T00:00:00Z", + "payload": { + "action": "created", + "repository": {"full_name": self.REPO}, + "sender": {"login": "master"}, + "issue": {"number": 730, "title": "Webhook path fallback"}, + "comment": { + 
"body": "見れる?", + "html_url": "https://github.com/Liplus-Project/liplus-language/issues/730#issuecomment-1", + }, + }, + } + events_path.write_text(json.dumps([event], ensure_ascii=False, indent=2), encoding="utf-8") + (trigger_dir / "evt-1.json").write_text("{}", encoding="utf-8") + (runs_dir / "evt-1.md").write_text("result", encoding="utf-8") + + payload = module.consume_pending(events_path, [event], limit=5, state_dir=state_dir) + + self.assertEqual(payload["pending_count"], 1) + self.assertEqual(payload["consumed_count"], 1) + self.assertEqual(payload["remaining_count"], 0) + self.assertEqual(payload["items"][0]["id"], "evt-1") + self.assertEqual(json.loads(events_path.read_text(encoding="utf-8")), []) + self.assertFalse((trigger_dir / "evt-1.json").exists()) + self.assertFalse((runs_dir / "evt-1.md").exists()) def test_consume_pending_drains_backlog_beyond_limit(self) -> None: - with tempfile.TemporaryDirectory() as tmp: - state_dir = Path(tmp) / "github-webhook-mcp" - trigger_dir = state_dir / "trigger-events" - runs_dir = state_dir / "codex-runs" - trigger_dir.mkdir(parents=True) - runs_dir.mkdir() - - events_path = state_dir / "events.json" - events = [ - { - "id": "evt-1", - "type": "issues", - "processed": False, - "received_at": "2026-03-15T00:00:00Z", - "payload": { - "action": "opened", - "repository": {"full_name": "Liplus-Project/liplus-language"}, - "issue": {"number": 1, "title": "one"}, - }, + state_dir, trigger_dir, runs_dir = self.make_state_dir() + events_path = state_dir / "events.json" + events = [ + { + "id": "evt-1", + "type": "issues", + "processed": False, + "received_at": "2026-03-15T00:00:00Z", + "payload": { + "action": "opened", + "repository": {"full_name": self.REPO}, + "issue": {"number": 1, "title": "one"}, }, - { - "id": "evt-2", - "type": "issues", - "processed": False, - "received_at": "2026-03-15T00:01:00Z", - "payload": { - "action": "opened", - "repository": {"full_name": "Liplus-Project/liplus-language"}, - "issue": 
{"number": 2, "title": "two"}, - }, + }, + { + "id": "evt-2", + "type": "issues", + "processed": False, + "received_at": "2026-03-15T00:01:00Z", + "payload": { + "action": "opened", + "repository": {"full_name": self.REPO}, + "issue": {"number": 2, "title": "two"}, }, - { - "id": "evt-3", - "type": "issues", - "processed": False, - "received_at": "2026-03-15T00:02:00Z", - "payload": { - "action": "opened", - "repository": {"full_name": "Liplus-Project/liplus-language"}, - "issue": {"number": 3, "title": "three"}, - }, + }, + { + "id": "evt-3", + "type": "issues", + "processed": False, + "received_at": "2026-03-15T00:02:00Z", + "payload": { + "action": "opened", + "repository": {"full_name": self.REPO}, + "issue": {"number": 3, "title": "three"}, }, - ] - events_path.write_text(json.dumps(events, ensure_ascii=False, indent=2), encoding="utf-8") - for event in events: - (trigger_dir / f"{event['id']}.json").write_text("{}", encoding="utf-8") - (runs_dir / f"{event['id']}.md").write_text("result", encoding="utf-8") - - payload = module.consume_pending(events_path, events, limit=1, state_dir=state_dir) - - self.assertEqual(payload["pending_count"], 3) - self.assertEqual(payload["consumed_count"], 3) - self.assertEqual(payload["remaining_count"], 0) - self.assertEqual([item["id"] for item in payload["items"]], ["evt-3"]) - self.assertEqual(json.loads(events_path.read_text(encoding="utf-8")), []) - for event in events: - self.assertFalse((trigger_dir / f"{event['id']}.json").exists()) - self.assertFalse((runs_dir / f"{event['id']}.md").exists()) + }, + ] + events_path.write_text(json.dumps(events, ensure_ascii=False, indent=2), encoding="utf-8") + for event in events: + (trigger_dir / f"{event['id']}.json").write_text("{}", encoding="utf-8") + (runs_dir / f"{event['id']}.md").write_text("result", encoding="utf-8") + + payload = module.consume_pending(events_path, events, limit=1, state_dir=state_dir) + + self.assertEqual(payload["pending_count"], 3) + 
self.assertEqual(payload["consumed_count"], 3)
+        self.assertEqual(payload["remaining_count"], 0)
+        self.assertEqual([item["id"] for item in payload["items"]], ["evt-3"])
+        self.assertEqual(json.loads(events_path.read_text(encoding="utf-8")), [])
+        for event in events:
+            self.assertFalse((trigger_dir / f"{event['id']}.json").exists())
+            self.assertFalse((runs_dir / f"{event['id']}.md").exists())
 
     def test_no_source_payload_is_silent_noop(self) -> None:
         payload = module.no_source_payload()
         self.assertEqual(payload["source"], "none")
         self.assertEqual(payload["pending_count"], 0)
+        self.assertEqual(payload["relevant_count"], 0)
         self.assertEqual(payload["items"], [])

From 36088981469a2e895fb83f8f1b72b19b790546e5 Mon Sep 17 00:00:00 2001
From: liplus-lin-lay
Date: Thu, 19 Mar 2026 22:07:47 +0900
Subject: [PATCH 3/5] chore(claude): suppress successful ci webhook chatter
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Changed the Claude Code hook's foreground notification display to exclude successful internal check_run / workflow_run events.

A follow-up change that keeps the inspect-first helper intact and trims only the webhook context noise flowing in before the conversation.

Refs #786
---
 Li+claude.md | 26 ++++++++++++++++----------
 1 file changed, 16 insertions(+), 10 deletions(-)

diff --git a/Li+claude.md b/Li+claude.md
index 503453e..0dc58e8 100644
--- a/Li+claude.md
+++ b/Li+claude.md
@@ -90,31 +90,37 @@ fi
 RESULT=$(python3 "$HELPER" "${HELPER_ARGS[@]}" 2>/dev/null)
 [ -z "$RESULT" ] && exit 0
 
-MENTION_COUNT=$(printf '%s' "$RESULT" | python3 -c "import sys,json; print(json.load(sys.stdin).get('mention_count',0))" 2>/dev/null)
-if [ -z "$MENTION_COUNT" ] || [ "$MENTION_COUNT" = "0" ]; then
-  exit 0
-fi
-
-echo ""
-echo "━━━ Webhook: foreground/notable notification(s) ━━━"
-printf '%s' "$RESULT" | python3 -c '
+DISPLAY_LINES=$(printf '%s' "$RESULT" | python3 -c '
 import json
 import sys
 
+SUCCESS = {"success", "skipped", "neutral"}
 data = json.load(sys.stdin)
 seen = set()
+lines = []
 for bucket, label in (("relevant_items", 
"foreground"), ("notable_items", "notable")):
     for item in data.get(bucket, []):
         event_id = item.get("id")
         if event_id in seen:
             continue
+        if bucket == "relevant_items":
+            if item.get("type") in {"check_run", "workflow_run"} and item.get("conclusion") in SUCCESS:
+                continue
         seen.add(event_id)
         number = item.get("number")
         title = item.get("title") or item.get("type") or "notification"
         event_type = item.get("type") or "event"
         prefix = f"#{number} " if number is not None else ""
-        print(f"[{label}] {event_type} {prefix}{title}")
-'
+        lines.append(f"[{label}] {event_type} {prefix}{title}")
+print("\n".join(lines))
+' 2>/dev/null)
+if [ -z "$DISPLAY_LINES" ]; then
+  exit 0
+fi
+
+echo ""
+echo "━━━ Webhook: foreground/notable notification(s) ━━━"
+printf '%s\n' "$DISPLAY_LINES"
 echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
 ```

From df059409455fb677f084f4a3ae87f395451647bd Mon Sep 17 00:00:00 2001
From: liplus-lin-lay
Date: Thu, 19 Mar 2026 22:32:03 +0900
Subject: [PATCH 4/5] fix(core): add notifications layer to layer definition
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

The Layer Definition in Li+core.md was still at 4 layers, so the Notifications Layer and the attachment chain were added to resolve the divergence from docs / implementation.

A fix that reconciles, within this PR, the core / docs inconsistency pointed out in Claude review.

Refs #786
---
 Li+core.md | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/Li+core.md b/Li+core.md
index 45f34cd..0c78633 100644
--- a/Li+core.md
+++ b/Li+core.md
@@ -100,7 +100,7 @@ Purpose: reproduce judgment across sessions and across different AIs.
 Layer Definition
 ----------------
 
-Four layers. Each program file declares its own layer membership.
+Five layers. Each program file declares its own layer membership.
 This section defines layer roles only, not file names.
 
 Model Layer:
@@ -115,12 +115,16 @@ Operations Layer:
   branch / commit / change request / verification / merge / release procedures.
   Event-driven surface. 
Loaded on demand, not every session.
 
+Notifications Layer:
+  notification ownership, claim/read/done, foreground mention rules, cleanup rules.
+  Shared queue semantics across GitHub Notifications API, webhook, and fallback state stores.
+
 Adapter Layer:
   host injection, runtime triggers, reread wiring, platform-specific bindings.
   Connects Li+ program to the host environment.
 
 Attachment chain:
-model -> task -> operations -> adapter
+model -> task -> operations -> notifications -> adapter
 Attachment chain = dependency order only
 
 Cross-layer rule:

From 382859e32cb8932c8cf60e28e4bc4d06df031e6d Mon Sep 17 00:00:00 2001
From: liplus-lin-lay
Date: Thu, 19 Mar 2026 22:44:49 +0900
Subject: [PATCH 5/5] refactor(core): keep layer definition structural only
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Removed the per-layer role descriptions from the Layer Definition in Li+core.md, thinning it so that core defines only layer existence and the attachment chain.

Cleanup that follows the review comment that detailed layer responsibilities should live on each layer file's side.

Refs #786
---
 Li+core.md | 30 +++++++++--------------------
 1 file changed, 9 insertions(+), 21 deletions(-)

diff --git a/Li+core.md b/Li+core.md
index 0c78633..1de9365 100644
--- a/Li+core.md
+++ b/Li+core.md
@@ -101,27 +101,15 @@ Purpose: reproduce judgment across sessions and across different AIs.
 ----------------
 
 Five layers. Each program file declares its own layer membership.
-This section defines layer roles only, not file names.
-
-Model Layer:
-  invariants, intra-layer order, dialogue surface, behavioral style, task mode.
-  Foundation of the Li+ program. All other layers depend on this.
-
-Task Layer:
-  issue rules, label vocabulary, issue-body convergence, parent/child structure.
-  Defines how work units are tracked and managed.
-
-Operations Layer:
-  branch / commit / change request / verification / merge / release procedures.
-  Event-driven surface. Loaded on demand, not every session.
- -Notifications Layer: - notification ownership, claim/read/done, foreground mention rules, cleanup rules. - Shared queue semantics across GitHub Notifications API, webhook, and fallback state stores. - -Adapter Layer: - host injection, runtime triggers, reread wiring, platform-specific bindings. - Connects Li+ program to the host environment. +Core defines layer existence and attachment order only. +Detailed role definitions belong to each layer file. + +Layers: + Model Layer + Task Layer + Operations Layer + Notifications Layer + Adapter Layer Attachment chain: model -> task -> operations -> notifications -> adapter
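The foreground-match rule this series pins down (same repository, plus a hit on a foreground issue/PR number or branch, with numbers inferable from branch names like `spec/786-notifications-layer`) can be sketched independently of the helper. The class and function names below are illustrative stand-ins, not the helper's actual API:

```python
import re
from dataclasses import dataclass


@dataclass(frozen=True)
class Foreground:
    # Illustrative stand-in for the helper's InspectContext fields.
    repo: str
    numbers: frozenset[str]
    branches: frozenset[str]


def infer_numbers(branches: set[str]) -> set[str]:
    # Branch names like "spec/786-notifications-layer" or "778-repo-first"
    # carry a leading issue number before the first hyphen.
    found = set()
    for branch in branches:
        match = re.search(r"(?:^|/)(\d+)-", branch)
        if match:
            found.add(match.group(1))
    return found


def is_foreground(event: dict, ctx: Foreground) -> bool:
    # Repo must match; then either the issue/PR number or the branch must hit.
    payload = event.get("payload", {})
    if payload.get("repository", {}).get("full_name") != ctx.repo:
        return False
    issue = payload.get("issue") or payload.get("pull_request") or {}
    if str(issue.get("number")) in ctx.numbers:
        return True
    branch = (payload.get("workflow_run") or {}).get("head_branch") \
        or ((payload.get("pull_request") or {}).get("head") or {}).get("ref")
    return branch in ctx.branches
```

Events that fail this predicate stay in the shared queue untouched, which is what keeps `inspect` safe as the default mode.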