feat: prototype MCP docs generation via mcpdocs-gen #1013

Draft
Aaron ("AJ") Steers (aaronsteers) wants to merge 1 commit into main from devin/1776390943-mcp-docs-prototype

Conversation

Aaron ("AJ") Steers (@aaronsteers), Member, commented Apr 17, 2026

Summary

Prototype end-to-end workflow for generating static HTML documentation for the PyAirbyte MCP server using mcpdocs-gen. The generator introspects a running FastMCP server over SSE and emits a searchable static site covering tools, resources, and prompts — positioned as a possible "pdoc3 for MCP servers."

Changes:

  • scripts/generate_mcp_docs.py — standalone script that starts airbyte.mcp.server:app in SSE mode as a background subprocess (in its own process group), TCP-healthchecks the port, shells out to mcpdocs generate, and tears the server down cleanly on success, failure, or Ctrl+C (SIGTERM with SIGKILL fallback).
  • pyproject.toml — adds poe mcp-docs-generate task alias.
  • .gitignore — ignores the generated docs/mcp-generated/ output directory.
  • docs/CONTRIBUTING.md — adds a short section on how to regenerate MCP docs locally.

mcpdocs-gen is not added to project dependencies — it's installed ad-hoc via uv pip install mcpdocs-gen per the contributing doc, since this is an exploration PR and we don't want to commit to the tool as a first-party dep yet.

Smoke test: the script ran end-to-end against the live PyAirbyte MCP server and produced a site documenting 51 tools, 1 resource, and 1 prompt. See screenshots and candid evaluation in the session report.

Review & Testing Checklist for Human

  • Run uv pip install mcpdocs-gen && poe mcp-docs-generate on a fresh checkout and confirm docs/mcp-generated/index.html renders all 51 tools with arguments, descriptions, and JSON schemas.
  • Sanity-check subprocess cleanup: interrupt the script with Ctrl+C mid-run and verify no orphaned Python process is listening on port 8765 afterward (lsof -iTCP:8765 or ss -ltnp).
  • Decide whether the "install mcpdocs-gen separately" UX is acceptable for a prototype, or whether it should be pinned under a dev dependency group before merge.
  • Review the candid evaluation in the session report and decide whether to adopt mcpdocs-gen org-wide, restrict it to one-off dumps, or pick a different tool.

Notes

  • The script defaults to port 8765 (not 8000) to avoid colliding with the existing poe mcp-serve-sse task, which hardcodes 127.0.0.1:8000. --port is configurable.
  • Per the task's non-goals: the MCP server's tool definitions/docstrings were not modified; any doc-quality gaps in the generated output reflect the current state of tool annotations and are called out in the evaluation, not papered over.
  • Lint (uv run ruff check ., uv run ruff format --check .) and type checking (uv run pyrefly check — the repo uses pyrefly, not mypy) both pass locally.

Link to Devin session: https://app.devin.ai/sessions/359e794efeb844b2a8adf02b5831f999
Requested by: Aaron ("AJ") Steers (@aaronsteers)

Summary by CodeRabbit

Release Notes

  • New Features

    • Added support for generating static HTML documentation for the MCP server.
  • Documentation

    • Added instructions for generating MCP server documentation with setup details.
  • Chores

    • Updated version control configuration to exclude generated documentation directories.

Adds a reproducible workflow for generating static HTML documentation
for the PyAirbyte MCP server using mcpdocs-gen.

- scripts/generate_mcp_docs.py: starts the MCP server in SSE mode,
  waits for readiness, shells out to mcpdocs, and tears down the
  server subprocess cleanly (incl. SIGINT handling).
- poe mcp-docs-generate: task alias that runs the script.
- docs/CONTRIBUTING.md: how-to snippet for local regeneration.
- .gitignore: ignore docs/mcp-generated/ output directory.

This is a prototype/evaluation PR; see the PR description for a candid
evaluation of mcpdocs-gen's fitness as PyAirbyte's long-term MCP docs
generator.
@devin-ai-integration
Contributor

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

@github-actions

👋 Greetings, Airbyte Team Member!

Here are some helpful tips and reminders for your convenience.

💡 Show Tips and Tricks

Testing This PyAirbyte Version

You can test this version of PyAirbyte using the following:

```shell
# Run PyAirbyte CLI from this branch:
uvx --from 'git+https://github.com/airbytehq/PyAirbyte.git@devin/1776390943-mcp-docs-prototype' pyairbyte --help

# Install PyAirbyte from this branch for development:
pip install 'git+https://github.com/airbytehq/PyAirbyte.git@devin/1776390943-mcp-docs-prototype'
```

PR Slash Commands

Airbyte Maintainers can execute the following slash commands on your PR:

  • /fix-pr - Fixes most formatting and linting issues
  • /uv-lock - Updates uv.lock file
  • /test-pr - Runs tests with the updated PyAirbyte
  • /prerelease - Builds and publishes a prerelease version to PyPI
📚 Show Repo Guidance

Helpful Resources

Community Support

Questions? Join the #pyairbyte channel in our Slack workspace.


@devin-ai-integration
Contributor

Screenshots

Index page (tool list, search bar, dark-mode toggle):

mcpdocs-gen index

Tool detail view (parameters table + JSON schema):

mcpdocs-gen tool detail

Candid evaluation of mcpdocs-gen

What it does well

  • Works end-to-end out of the box. uv pip install mcpdocs-gen → start server → one CLI invocation → static HTML. No config, templates, build system, or Node toolchain. This is the biggest single win vs. rolling our own.
  • Speaks MCP natively, not just "Python functions". It pulls the introspection payload from a live SSE server, so the generated docs reflect the actual protocol surface (tools + params + JSON schemas + resources + prompts) rather than Python signatures. Refactor the Python, re-run the CLI, docs stay true.
  • Documents all three MCP primitives. Tools, resources, and prompts each get their own section. Resource templates render correctly.
  • JSON schemas render well. Each tool has a collapsible "Show schema" block with the raw JSON schema pretty-printed — useful when a param description is weak but the schema is specific (enum values, min/max).
  • Zero-dep static output. Plain HTML/CSS/JS in one directory. Publishable to S3, GH Pages, a CDN, or zipped for airgapped review with no extra build step.
  • Readable default styling. Clean typography, built-in dark-mode toggle, client-side fuzzy search across tool names/descriptions, filter chips for tools vs. resources vs. prompts. Feels like a modern devdocs site, not a 2005 Javadoc.
  • Fail-loud on missing schemas. Bad/missing JSON schemas show up immediately in the UI — harder to hide bad annotations.

What it does poorly / is missing

  • No offline / static-introspection mode. You must spin up a live SSE server to generate docs. Fine locally, awkward in CI — you end up with a subprocess + healthcheck + teardown dance (exactly what scripts/generate_mcp_docs.py does). A "given a dumped tools.json, render the site" mode would be much more CI-friendly.
  • Not Docusaurus / MDX compatible. Output is a self-contained site, not a set of .md/.mdx fragments. You can't drop it into an existing Docusaurus sidebar; at best you iframe or link out. For Airbyte this matters — our user docs live in Docusaurus.
  • Search is local/fuzzy-only. Client-side JS filter. No pre-built index, no keyboard shortcut, no result snippets, no cross-reference search. For 51 tools it's fine; for a big org-wide aggregate across multiple MCP servers it would feel thin.
  • No deep-linking / stable anchors. Tool pages appear to use JS-driven routing rather than one-HTML-file-per-tool. Hard to link a teammate to "the exact docs for list_connectors" from a Slack thread.
  • No multi-server / aggregate mode. One invocation documents one running server. No first-class "point at 3 MCP servers, emit one unified site" option. For Airbyte (PyAirbyte + airbyte-ops-mcp + sonar + …) you'd have to stitch sites together yourself.
  • No versioning. No "v0.44.1 vs v0.44.0" diff, no version selector, no changelog surface. Regenerating overwrites.
  • No public theming/templating API. Hard to add an Airbyte header, "report a bug" link, or custom footer without forking. Task explicitly prohibits forking, so this is a real constraint.
  • Minimal customization of what's shown. No obvious way to hide internal tools, group tools into categories, or add prose sections between generated content.
  • Underdocumented tools look bad. mcpdocs-gen faithfully renders whatever the server advertises. Several PyAirbyte tools have terse or missing descriptions, and the output makes that immediately visible. Per this task's non-goals I didn't edit any docstrings, but it's worth flagging: the generated site is a quality mirror of our MCP tool annotations, which is both a feature (finds gaps) and a liability (public-looking docs expose them).
  • Small / early project. mcpdocs-gen 0.1.3, single-maintainer repo, no obvious release cadence. Adopting it org-wide is a bet on that continuing.

Is it pdoc3-quality for MCP?

Roughly, for a prototype — no, not yet for polished public docs. pdoc3 has years of polish around navigation, stable URLs, cross-references, search, and theming. mcpdocs-gen is closer to "pdoc3 circa its first public release" than "pdoc3 today": the idea is right (auto-generate from introspection, static output, zero ceremony), and the MVP is good enough for internal dev references and architecture reviews, but the polish (deep-linking, Docusaurus integration, multi-project aggregation, theming, offline mode) isn't there yet. It's the best free option I've seen targeted specifically at MCP, and the first one that's obviously "shaped like pdoc3" — but it has real gaps before it can replace human-curated docs.

Recommendation

Adopt for internal/dev-facing uses now. Don't bet user-facing docs on it yet.

  • Yes, adopt for "I want to quickly browse what tools a given MCP server exposes" — internal architecture reviews, PR reviews of MCP surface changes, and one-off dumps attached to incident reports. Install-and-run cost is near zero and the output is genuinely readable.
  • Yes, adopt the script pattern from this PR across other Airbyte MCP servers (airbyte-ops-mcp, etc.) — the generator is server-agnostic, so the same script with a different entrypoint works. Low cost, high value.
  • No, don't rely on it for customer/partner-facing MCP docs on docs.airbyte.com. Missing Docusaurus/MDX integration, stable URLs, multi-server aggregation, and versioning are blockers there.
  • Revisit in 6 months. If the project adds offline mode + MDX output + deep-linking it becomes a strong candidate for public docs. Otherwise it stays in the "useful internal tool" bucket and we supplement with a bespoke MDX generator on top of fastmcp inspect or wait for a more mature alternative.

cc Aaron ("AJ") Steers (@aaronsteers) — would love your read on the evaluation and on whether it's worth rolling out this script pattern to the other Airbyte MCP servers.




coderabbitai Bot commented Apr 17, 2026

📝 Walkthrough

The PR introduces infrastructure to automatically generate static HTML documentation for the MCP server. It includes a new Python script that orchestrates spawning an MCP server, waiting for readiness, invoking the mcpdocs generator, and cleaning up server processes. Supporting configuration and documentation are added alongside a git-ignore rule for the generated output directory.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Configuration & Build: .gitignore, pyproject.toml | Added git ignore rule for the generated docs directory and created new Poe task mcp-docs-generate to execute the documentation generation script. |
| Documentation: docs/CONTRIBUTING.md | Added new section describing the MCP server documentation generation workflow, including tool setup, task execution, and output location with reference to the implementation script. |
| Documentation Generation: scripts/generate_mcp_docs.py | New executable script that manages the full lifecycle of MCP docs generation: spawning the server via subprocess, waiting for port readiness, running the mcpdocs CLI against the SSE endpoint, and performing cleanup with signal handling and timeout escalation. |

Sequence Diagram

```mermaid
sequenceDiagram
    actor User
    participant CLI as generate_mcp_docs.py
    participant Server as MCP Server<br/>(subprocess)
    participant PortCheck as Port Checker
    participant McpDocs as mcpdocs generate
    participant Filesystem as File System

    User->>CLI: poe mcp-docs-generate
    CLI->>Server: spawn via subprocess (SSE mode)
    CLI->>PortCheck: wait_for_port(host:port, timeout)
    PortCheck->>PortCheck: poll TCP connection
    PortCheck-->>CLI: port ready
    CLI->>McpDocs: run generate --url http://host:port/sse
    McpDocs->>Server: connect via SSE endpoint
    Server-->>McpDocs: stream server metadata
    McpDocs->>Filesystem: write HTML docs to output dir
    McpDocs-->>CLI: success
    CLI->>Server: terminate process group (SIGTERM)
    Server-->>CLI: shutdown
    CLI->>Filesystem: output available at docs/mcp-generated/
    CLI-->>User: exit(0)
```

Estimated Code Review Effort

🎯 3 (Moderate) | ⏱️ ~22 minutes

🚥 Pre-merge checks | ✅ 3 passed

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped — CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title 'feat: prototype MCP docs generation via mcpdocs-gen' directly summarizes the main change: adding a prototype workflow for MCP documentation generation using the external tool mcpdocs-gen. |
| Docstring Coverage | ✅ Passed | Docstring coverage is 100.00%, which meets the required threshold of 80.00%. |


@coderabbitai coderabbitai Bot left a comment
Actionable comments posted: 1

🧹 Nitpick comments (4)
scripts/generate_mcp_docs.py (3)

44-58: TCP connect readiness vs. SSE endpoint readiness — possible race?

_wait_for_port returns as soon as the socket accepts a connection, but that doesn't guarantee FastMCP has finished mounting the /sse route. In practice uvicorn typically binds after the ASGI app is ready, so this is usually fine — just flagging in case you've seen flakiness. If it does become flaky, a tiny HTTP GET to /sse (or any known endpoint) with a short retry would be more robust. No action needed if the smoke test has been stable.
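The HTTP-GET readiness probe suggested here could look roughly like this. The helper name `wait_for_http` is hypothetical; this is a sketch of the reviewer's suggestion, not code from the PR:

```python
import time
import urllib.error
import urllib.request


def wait_for_http(url: str, timeout: float = 60.0) -> None:
    """Poll an HTTP endpoint until it answers, or raise TimeoutError with the last error."""
    deadline = time.monotonic() + timeout
    last_error: Exception | None = None
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2.0):
                return  # got a 2xx/3xx response: route is mounted and serving
        except urllib.error.HTTPError:
            # Any HTTP status (even 4xx/5xx) means the app is up and routing.
            return
        except OSError as exc:  # connection refused, reset, timeout, etc.
            last_error = exc
            time.sleep(0.25)
    raise TimeoutError(f"{url} not ready within {timeout}s (last error: {last_error})")
```

This checks the ASGI app is actually serving the route, not just that the socket is bound, at the cost of a slightly heavier per-attempt probe.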

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/generate_mcp_docs.py` around lines 44 - 58, _wait_for_port currently
only checks TCP connectability which can return before the FastMCP app has
mounted the /sse route; to make readiness checks robust change _wait_for_port to
perform an actual HTTP GET against the SSE endpoint (e.g., GET /sse or another
known health endpoint) with a short per-attempt timeout and retry loop until the
overall deadline, falling back to the existing TCP check if needed; update
references to _wait_for_port to expect the same signature and ensure the raised
TimeoutError message includes the last HTTP error or status for debugging.

104-108: Stale output directory not cleaned before regeneration — intentional?

output.mkdir(parents=True, exist_ok=True) preserves prior contents, so if tools/resources are removed from the MCP server between runs, their old HTML pages will linger in docs/mcp-generated/. Would it be worth doing a shutil.rmtree(output, ignore_errors=True) before the mkdir (or delegating to mcpdocs if it has a clean flag), wdyt?
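The suggested fix is small; a sketch of the clean-then-recreate pattern (hypothetical helper name `prepare_output_dir`):

```python
import shutil
from pathlib import Path


def prepare_output_dir(output: Path) -> None:
    """Remove any stale generated docs, then recreate an empty output directory."""
    shutil.rmtree(output, ignore_errors=True)  # drop pages for tools that no longer exist
    output.mkdir(parents=True, exist_ok=True)
```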

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/generate_mcp_docs.py` around lines 104 - 108, The current generation
preserves prior contents by calling output.mkdir(...), which leaves stale pages
behind; before invoking subprocess.run([mcpdocs_bin, "generate", ...],
check=True) remove the existing output directory (use shutil.rmtree(output,
ignore_errors=True)) and then recreate it (output.mkdir(parents=True,
exist_ok=True)) so the generated docs directory is clean; reference the
variables output and mcpdocs_bin and the subprocess.run call when making the
change.

72-77: Silencing server stderr makes startup failures hard to diagnose — capture it?

With stdout=DEVNULL and stderr=STDOUT, if the subprocess crashes on import (bad env, port in use, etc.), _wait_for_port will just time out after 60s with no breadcrumbs. Would you consider piping stderr to a tempfile (or subprocess.PIPE + drain on timeout) so the TimeoutError message can include the last few lines of server output, wdyt?
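A sketch of the tempfile-capture approach suggested here (hypothetical helper names `start_server_with_log` and `tail`; not code from the PR):

```python
import subprocess
import sys
import tempfile


def start_server_with_log(cmd: list[str]) -> tuple[subprocess.Popen[bytes], str]:
    """Spawn a subprocess with stderr captured to a log file for post-mortem reads."""
    log = tempfile.NamedTemporaryFile(mode="wb", suffix=".log", delete=False)
    proc = subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=log)
    log.close()  # the child inherited the fd at spawn; the parent no longer needs it
    return proc, log.name


def tail(log_path: str, n: int = 20) -> str:
    """Return the last n lines of the captured log, e.g. for a TimeoutError message."""
    with open(log_path, "rb") as f:
        lines = f.read().decode(errors="replace").splitlines()
    return "\n".join(lines[-n:])
```

On a startup timeout, the script could then raise `TimeoutError(tail(log_path))` instead of failing silently.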

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/generate_mcp_docs.py` around lines 72 - 77, The subprocess is
currently started with stdout=DEVNULL and stderr=STDOUT which discards server
error output; change the Popen call in scripts/generate_mcp_docs.py to capture
stderr (e.g., create a tempfile.NamedTemporaryFile(mode='w+b') or use
subprocess.PIPE and store the handle) and pass that handle as stderr to
subprocess.Popen (keep stdout=DEVNULL if desired). Then, in the _wait_for_port
function (or wherever the TimeoutError is raised), read the last N lines from
that tempfile (or drain the PIPE) on timeout and include those lines in the
TimeoutError message; ensure the tempfile/pipe is closed/cleaned up after use
and that file descriptors are passed correctly to Popen.
pyproject.toml (1)

177-177: Task wiring looks good — consider documenting the mcpdocs-gen install step as a task too?

Since mcpdocs-gen isn't a declared dep (per the PR description, intentionally ad-hoc for the prototype), running poe mcp-docs-generate on a fresh checkout will fail with the "CLI not found" error from the script until users read the CONTRIBUTING section. Would it be worth adding a sibling mcp-docs-install task (or a shell that does uv pip install mcpdocs-gen && python scripts/...) to make the prototype one-shot-runnable, wdyt? Happy to leave as-is for now if you'd rather keep the dependency decision explicit per the PR checklist.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyproject.toml` at line 177, Add a sibling poe task to make the docs
generation one-shot: create a new task named mcp-docs-install that installs the
ad-hoc CLI then invokes the existing generator (so it runs uv pip install
mcpdocs-gen && python scripts/generate_mcp_docs.py or separate install then
call), referencing the existing mcp-docs-generate task name so callers can
choose installation+generate in one command; ensure the task help text explains
it installs mcpdocs-gen before generating docs.
ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 758bef0f-7cde-4674-a378-3bfc40d0cc63

📥 Commits

Reviewing files that changed from the base of the PR and between ce1a589 and f62b5d1.

📒 Files selected for processing (4)
  • .gitignore
  • docs/CONTRIBUTING.md
  • pyproject.toml
  • scripts/generate_mcp_docs.py

Comment on lines +61 to +93
```python
def _start_mcp_server(host: str, port: int) -> subprocess.Popen[bytes]:
    """Start the PyAirbyte MCP server in SSE mode as a background subprocess."""
    cmd = [
        sys.executable,
        "-c",
        (
            "from airbyte.mcp.server import app; "
            f"app.run(transport='sse', host={host!r}, port={port})"
        ),
    ]
    # Start in its own process group so we can signal the whole tree on shutdown.
    return subprocess.Popen(
        cmd,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.STDOUT,
        start_new_session=True,
    )


def _stop_mcp_server(proc: subprocess.Popen[bytes]) -> None:
    """Terminate the MCP server subprocess tree cleanly."""
    if proc.poll() is not None:
        return
    try:
        os.killpg(proc.pid, signal.SIGTERM)
    except ProcessLookupError:
        return
    try:
        proc.wait(timeout=10.0)
    except subprocess.TimeoutExpired:
        with contextlib.suppress(ProcessLookupError):
            os.killpg(proc.pid, signal.SIGKILL)
        proc.wait(timeout=5.0)
```
@coderabbitai coderabbitai Bot commented Apr 17, 2026
⚠️ Potential issue | 🟡 Minor

POSIX-only process group handling — is Windows support in scope for this prototype?

start_new_session=True is a no-op on Windows, and os.killpg / signal.SIGKILL aren't available there, so _stop_mcp_server will raise AttributeError on Windows even in the "happy path" (it runs unconditionally in the finally). Given the project supports Python 3.10–3.12 on Windows per requires-python and there are Windows-specific pytest filters, would you like to either (a) gate this script to POSIX with a friendly error up-front, or (b) add a Windows fallback using proc.terminate() / proc.kill(), wdyt?

🛠️ Sketch of a cross-platform fallback
```diff
 def _start_mcp_server(host: str, port: int) -> subprocess.Popen[bytes]:
     ...
+    popen_kwargs: dict = {"stdout": subprocess.DEVNULL, "stderr": subprocess.STDOUT}
+    if os.name == "posix":
+        popen_kwargs["start_new_session"] = True
+    else:
+        popen_kwargs["creationflags"] = subprocess.CREATE_NEW_PROCESS_GROUP  # type: ignore[attr-defined]
-    return subprocess.Popen(
-        cmd,
-        stdout=subprocess.DEVNULL,
-        stderr=subprocess.STDOUT,
-        start_new_session=True,
-    )
+    return subprocess.Popen(cmd, **popen_kwargs)


 def _stop_mcp_server(proc: subprocess.Popen[bytes]) -> None:
     if proc.poll() is not None:
         return
-    try:
-        os.killpg(proc.pid, signal.SIGTERM)
-    except ProcessLookupError:
-        return
+    if os.name == "posix":
+        try:
+            os.killpg(proc.pid, signal.SIGTERM)
+        except ProcessLookupError:
+            return
+    else:
+        proc.terminate()
     try:
         proc.wait(timeout=10.0)
     except subprocess.TimeoutExpired:
-        with contextlib.suppress(ProcessLookupError):
-            os.killpg(proc.pid, signal.SIGKILL)
+        with contextlib.suppress(ProcessLookupError):
+            if os.name == "posix":
+                os.killpg(proc.pid, signal.SIGKILL)
+            else:
+                proc.kill()
         proc.wait(timeout=5.0)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/generate_mcp_docs.py` around lines 61 - 93, The current subprocess
management in _start_mcp_server and _stop_mcp_server assumes POSIX-only APIs
(start_new_session, os.killpg, signal.SIGKILL) and will fail on Windows; update
the functions to detect the platform (using sys.platform or os.name) and provide
a Windows-safe fallback: when on POSIX keep the existing start_new_session and
os.killpg flow, but when on Windows avoid start_new_session and instead use the
Popen instance directly (proc.terminate()/proc.kill()) and proc.kill() as the
hard-kill fallback, catching and suppressing ProcessLookupError/AttributeError
as needed; alternatively, if Windows support is out of scope, raise a clear,
early PlatformError in _start_mcp_server explaining the script is POSIX-only so
callers get a friendly error instead of failing in the finally of
_stop_mcp_server.

Contributor
Thanks CodeRabbit (@coderabbitai) — acknowledged. All four are reasonable, but per the task brief this is a deliberately minimal prototype ("don't over-engineer") so I'm leaving them as follow-ups rather than expanding scope here:

  1. TCP vs. SSE readiness race — smoke test was stable across a dozen runs; noted in case it flakes later.
  2. Stale output directory — worth adding; small enough to include if AJ agrees the PR should grow.
  3. Captured stderr on startup failure — worth adding; same caveat.
  4. Windows subprocess fallback — out of scope for the prototype, but worth adding if this pattern moves into an internal tool library alongside scripts/generate_mcp_docs.py.
  5. Sibling mcp-docs-install task — intentionally skipped, per the PR description: we don't want to commit to mcpdocs-gen as a first-party dep until the eval is accepted.

Happy to fold any/all of these in if AJ wants the prototype hardened before merge.



Contributor
Skipped: comment is from another GitHub bot.

@github-actions

PyTest Results (Fast Tests Only, No Creds)

343 tests  ±0   343 ✅ ±0   5m 50s ⏱️ ±0s
  1 suites ±0     0 💤 ±0 
  1 files   ±0     0 ❌ ±0 

Results for commit f62b5d1. ± Comparison against base commit ce1a589.

@github-actions

PyTest Results (Full)

413 tests  ±0   395 ✅ ±0   25m 54s ⏱️ - 1m 33s
  1 suites ±0    18 💤 ±0 
  1 files   ±0     0 ❌ ±0 

Results for commit f62b5d1. ± Comparison against base commit ce1a589.

devin-ai-integration Bot added a commit that referenced this pull request Apr 17, 2026
Introduces `scripts/generate_mcp_markdown.py` (exposed via `poe mcp-docs-md`)
which introspects the MCP server with `fastmcp inspect` and renders a small
set of Markdown files under `docs/mcp-generated/`:

- index.md — server overview + counts + TOC
- tools.md — one H2 per tool with a GFM parameters table and collapsible
  input/output JSON schemas
- resources.md — concrete resources and resource templates
- prompts.md — prompts and their arguments

Formatting is modeled on `mcpdocs-gen` (evaluated in PR #1013) but emitted
as plain CommonMark + GFM + YAML front-matter + `<details>` blocks, so the
pages render correctly in both Docusaurus and `pdoc` without MDX-only
components. Every tool/resource/prompt has a stable slug anchor for
deep-linking.
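A rough sketch of how such a Markdown renderer might emit one tool section. The function name `render_tool_md` is hypothetical, and the field names assume the shape of an MCP `tools/list` payload (`name`, `description`, `inputSchema`); the actual script in the follow-up commit may differ:

```python
import json


def render_tool_md(tool: dict) -> str:
    """Render one MCP tool dict as a GFM section: H2, description, params table, schema."""
    schema = tool.get("inputSchema", {})
    props = schema.get("properties", {})
    required = set(schema.get("required", []))
    fence = "`" * 3  # built dynamically to avoid a literal fence inside this example
    lines = [f"## {tool['name']}", "", tool.get("description", "_No description._"), ""]
    if props:
        lines += [
            "| Parameter | Type | Required | Description |",
            "| --- | --- | --- | --- |",
        ]
        for name, spec in props.items():
            lines.append(
                f"| `{name}` | {spec.get('type', '')} | "
                f"{'yes' if name in required else 'no'} | {spec.get('description', '')} |"
            )
        lines.append("")
    # Collapsible raw schema, renderable in both Docusaurus and pdoc (plain HTML <details>).
    lines += [
        "<details><summary>Input JSON schema</summary>",
        "",
        fence + "json",
        json.dumps(schema, indent=2),
        fence,
        "",
        "</details>",
    ]
    return "\n".join(lines)
```

Because the output is plain CommonMark + GFM + `<details>`, each H2 gets a stable heading anchor for deep-linking, which addresses one of the gaps called out in the mcpdocs-gen evaluation above.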