We ran five documentation analysis tools against eight open-source repositories last week. Not a survey. Not opinions. Automated analysis of real docs in real repos.
The results tell a clear story: the projects with the best contributor experience are the ones that treat documentation with the same rigor as code.
We built a set of documentation quality skills and tested them against repos of different sizes, languages, and doc structures:
Stripe Node SDK: 848 links, zero broken. That is not an accident. That is a team that checks their docs the same way they check their code.
FastAPI: 132 broken links across 100+ doc files. FastAPI has excellent content, but cross-document references have drifted as the docs grew.
RedwoodJS: 103 broken links. Many are anchor mismatches, headings that were renamed without updating the links that point to them.
"Treat documentation with the same rigor as code" is a phrase that gets used loosely. Here is what it means in practice, based on what we saw across these repos.
Every project we tested keeps documentation in the same repository as the source code. Not a separate wiki. Not a Notion page. Same repo, same PR workflow, same review process.
This matters because when docs and code live apart, they drift apart. A developer changes a function signature in src/auth.ts and has no reason to check if docs/authentication.md references it.
When they are in the same repo, a single PR can update both. The reviewer sees both changes together. The CI pipeline can check both.
The difference between Stripe's 0 broken links and FastAPI's 132 is not that Stripe's team is more careful. It is that broken links get caught before they merge.
The types of broken links we found:
Anchor drift: A heading gets renamed from `## Setup instructions` to `## Setup`, but the link `#setup-instructions` elsewhere in the docs is not updated. We found this in Fastify (59 instances in `Ecosystem.md` alone) and RedwoodJS.
Cross-doc references: `[authentication guide](./auth.md)` stops working when `auth.md` gets renamed to `authentication.md`. FastAPI had 132 of these.
Orphaned pages: Documentation files that exist but are not linked from any other page. We found 19 orphaned pages in one project. Content that is effectively invisible to readers.
A link checker in CI catches all of these before they reach users.
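A minimal version of such a checker fits in a short Python script. This sketch handles relative links and approximates GitHub's heading-to-anchor slug rule, so treat it as illustrative rather than exhaustive:

```python
import re
from pathlib import Path

# Matches [text](path#anchor); group 1 is the path, group 2 the anchor.
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#\s]*)(#[^)\s]*)?\)")

def slugify(heading: str) -> str:
    """Approximate GitHub's heading-to-anchor rule: lowercase,
    drop punctuation, replace whitespace runs with hyphens."""
    text = heading.strip().lstrip("#").strip().lower()
    text = re.sub(r"[^\w\s-]", "", text)
    return "#" + re.sub(r"\s+", "-", text)

def anchors_in(path: Path) -> set[str]:
    """All anchors generated by the headings in a markdown file."""
    return {slugify(line) for line in path.read_text().splitlines()
            if line.startswith("#")}

def broken_links(docs_dir: str) -> list[tuple[Path, str]]:
    """Return (source file, broken target) pairs for internal links."""
    broken = []
    for md in Path(docs_dir).rglob("*.md"):
        for m in LINK_RE.finditer(md.read_text()):
            target, anchor = m.group(1), m.group(2)
            if target.startswith(("http://", "https://", "mailto:")):
                continue  # external links need a network checker
            dest = (md.parent / target).resolve() if target else md
            if target and not dest.exists():
                broken.append((md, target))            # cross-doc drift
            elif anchor and dest.suffix == ".md" and anchor not in anchors_in(dest):
                broken.append((md, target + anchor))   # anchor drift
    return broken
```

Running this against a `docs/` directory surfaces both cross-doc drift and anchor drift in one pass; orphaned pages need the inverse scan, since they are files nothing links to.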
When code changes, which documentation pages are now stale?
We analyzed recent code changes against documentation files and found:
The `Headers` symbol was modified in test files, and docs referencing it had not been updated.

The projects with zero stale docs were not the ones with less activity. They were the ones where documentation updates happen in the same PR as code changes.
We scanned the public API surface of each project and checked if documentation existed:
FastAPI stands out here. The 37% overall coverage includes all 230 public items (endpoints, internal functions, utilities). But when you break it down, 84% of their user-facing API endpoints are documented, compared to only 5% of internal functions. They document what matters to users and do not waste effort on internals. That is a deliberate coverage strategy, not a gap.
Hoppscotch at 2% is expected. Their docs are user-facing product docs, not API reference. The tool correctly classified this as "product docs" and noted that low coverage is normal for that type of docs.
GitHub's Open Source Survey found that 93% of respondents said incomplete or outdated documentation is a pervasive problem. 60% said they rarely or never contribute to documentation.
This creates a cycle: contributors do not update docs because docs are already out of date, so docs fall further behind, so fewer people bother.
The projects that break this cycle are the ones that make documentation a first-class part of the contribution workflow.
If you maintain an open-source project, run these checks against your docs. Each one maps directly to a tool you can add to CI in under 10 minutes.
Broken links: How many internal links point to files or anchors that do not exist? If the number is above zero, a reader will hit a dead end. Run a link checker against your docs/ directory and fix every broken reference before your next release.
Orphaned pages: How many documentation files are not linked from any other page? These are invisible to anyone navigating your docs. A docs coverage scan reveals pages that exist but have no inbound links. Either link them from the right place or remove them.
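An inbound-link scan like this can be sketched in a few lines of Python. The `entry` default of `index.md` is an assumption about your docs layout; adjust it to whatever your landing page is:

```python
import re
from pathlib import Path

# Matches the target of a markdown link that points at a .md file.
LINK_RE = re.compile(r"\]\(([^)#\s]+?\.md)")

def orphaned_pages(docs_dir: str, entry: str = "index.md") -> set[Path]:
    """Docs files with no inbound links from any other page.
    The entry page is treated as reachable by definition."""
    root = Path(docs_dir)
    pages = set(root.rglob("*.md"))
    linked = set()
    for md in pages:
        for m in LINK_RE.finditer(md.read_text()):
            linked.add((md.parent / m.group(1)).resolve())
    return {p for p in pages
            if p.resolve() not in linked and p.name != entry}
```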
Stale references: Which documentation pages reference functions, endpoints, or config options that have changed in recent commits? Compare your recent git history against docs files to find pages that have not been updated since the code they describe was modified.
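One way to sketch this comparison: pull last-modified timestamps from git history, then flag any page older than the code it describes. The `doc_refs` mapping from docs pages to code files is hypothetical here; in practice it would come from a symbol scan or a hand-maintained list:

```python
import subprocess

def last_commit_time(repo: str, path: str) -> int:
    """Unix timestamp of the last commit touching `path` (0 if never)."""
    out = subprocess.run(
        ["git", "-C", repo, "log", "-1", "--format=%ct", "--", path],
        capture_output=True, text=True).stdout.strip()
    return int(out) if out else 0

def stale_docs(doc_times: dict, code_times: dict, doc_refs: dict) -> list:
    """A docs page is stale if any code file it references was
    committed more recently than the page itself."""
    return [doc for doc, refs in doc_refs.items()
            if any(code_times.get(c, 0) > doc_times.get(doc, 0)
                   for c in refs)]
```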
Coverage: What percentage of your exported functions, classes, and endpoints have corresponding documentation? Scan your public API surface and check if each exported symbol has a matching entry in your docs.
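For a Python codebase, a rough version of this scan can use the standard library's `ast` module. The substring match for "documented" is deliberately naive; a real tool would match headings or API reference entries:

```python
import ast
from pathlib import Path

def public_symbols(py_file: str) -> set[str]:
    """Top-level functions and classes not prefixed with underscore."""
    tree = ast.parse(Path(py_file).read_text())
    return {n.name for n in tree.body
            if isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef,
                              ast.ClassDef))
            and not n.name.startswith("_")}

def doc_coverage(py_file: str, docs_dir: str) -> float:
    """Fraction of public symbols mentioned anywhere in the docs."""
    symbols = public_symbols(py_file)
    docs_text = " ".join(p.read_text()
                         for p in Path(docs_dir).rglob("*.md"))
    documented = {s for s in symbols if s in docs_text}
    return len(documented) / len(symbols) if symbols else 1.0
```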
These are not opinions about writing quality. They are measurable, automatable checks that tell you the structural health of your docs.
Here is a GitHub Actions workflow that runs link checking on every PR that touches documentation:
```yaml
# .github/workflows/docs-ci.yml
name: Docs CI
on:
  pull_request:
    paths:
      - 'docs/**'
      - '*.md'
jobs:
  check-links:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check for broken links
        uses: lycheeverse/lychee-action@v1
        with:
          args: --no-progress 'docs/**/*.md' '*.md'
          fail: true
```

This catches broken links before they merge. It adds less than 30 seconds to your CI pipeline and prevents the kind of link rot we found in Fastify (59 broken links in a single file) and FastAPI (132 across the project).
For stale reference detection and coverage analysis, you need tools that understand the relationship between your code and your docs. That is what we built.
We open-sourced the documentation analysis tools we used for this experiment as Claude Code skills.
They are at github.com/ekline-io/ekline-docs-skills. MIT licensed. Each skill is backed by a Python script that does the analysis deterministically. No prompt engineering, no LLM interpretation of your docs.
The documentation quality gap in open source is structural, not cultural. The projects that close it are the ones that automate the checks. If your project has more than 20 documentation files and no CI for docs, you have broken links. You just do not know which ones yet.