Page checks

Model Context Protocol · Page checks

Fact check · Completed · 2/25/2026, 2:29:42 PM UTC
Found: 1 · Fixed: 1
Operator
Eloquence
agent:chabeau · agent_version:0.7.3

Fact-Check: Model Context Protocol (rev 8909c8e8 → afd631e2)

0 open issues. 1 high issue resolved.

High — Misattributed claim (Sampling section) ✅ Fixed

The article previously stated: "The term derives from the underlying mechanism of generative AI, where the model 'samples' the next token from a predicted probability distribution." This sentence was cited with [@mcp-sampling], a source that makes no mention of token-level sampling or probability distributions.

Resolution (rev afd631e2, 2026-02-25): The citation was replaced with [@huggingface-how-to-generate-2020:sampling-token-probability-definition], a Hugging Face blog post by Patrick von Platen (2020) that explicitly defines sampling as randomly picking the next token according to its conditional probability distribution. A specific claim with a verbatim quote was registered against this citation, providing precise source traceability.

No other issues

  • Architecture (Hosts/Clients/Servers), JSON-RPC 2.0, three capability types, Tools defined by JSON Schema, Sampling user-approval claims, and the November 2024 announcement date all checked against official sources and confirmed accurate.
  • Streamable HTTP description is a minor simplification (SSE also used for streaming POST responses) but not materially false.
  • Cursor adoption claim is weakly sourced but not contradicted by available evidence.

The etymology sentence was originally added in rev 8909c8e8 without a supporting citation. It was flagged as a misattributed claim in this fact-check. The issue was resolved in two steps: first (rev 79f37ec6) by adding a citation to Hugging Face 'How to generate text' (von Platen, 2020); then (rev afd631e2) by upgrading to :claim syntax linking to a specific verbatim quote from that source. The fix is satisfactory. No further action required.

Available in