<div class="speaker-notes" hidden="">On the consumption side, Aliki addressed the human experience — dark mode, search, responsive layout. For AI consumption, we looked at the options and found nothing with proven impact. Let me walk through what we evaluated.</div>
<h2 class="reveal">Rustdoc: community and maintainers said no</h2>
<ul class="reveal">
- <li>The most complete standard for LLM-friendly docs</li>
- <li><strong style="color:var(--text)">10% adoption</strong> across 300K domains, <strong style="color:var(--accent)">zero measurable impact</strong> on AI citations</li>
- <li>No major LLM provider officially consumes it</li>
+ <li>RFC #3751 proposed LLM-friendly text output for Rustdoc</li>
+ <li>T-rustdoc team: "Very likely not the desired format within 3 months, never mind 3 years"</li>
+ <li>Community consensus: "This should be an external tool"</li>
<div class="speaker-notes" hidden="">I actually prototyped llms.txt generation for RDoc. Then I looked at the data. SE Ranking studied 300,000 domains — only 10% adoption. An XGBoost model for predicting AI citations actually improved when llms.txt was removed as a variable. No major LLM provider officially supports it.</div>
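The speaker notes above mention a prototype of llms.txt generation for RDoc. As a rough illustration of what such a generator involves — an H1 title, a blockquote summary, then Markdown link lists, per the llms.txt proposal — here is a minimal sketch. This is not the actual prototype; the method name and inputs are invented for illustration, and a real version would walk RDoc's own object model rather than take a hash of pages.

```ruby
# Illustrative sketch only: builds an llms.txt-shaped index from a
# project name, a one-line summary, and a {title => url} hash.
# Names here are hypothetical, not RDoc API.
def generate_llms_txt(project:, summary:, pages:)
  lines = ["# #{project}", "", "> #{summary}", "", "## Documentation"]
  pages.each { |title, url| lines << "- [#{title}](#{url})" }
  lines.join("\n") << "\n"
end

puts generate_llms_txt(
  project: "ExampleGem",
  summary: "Illustrative gem used to show the llms.txt layout.",
  pages: { "API reference" => "https://example.com/api.md" }
)
```

The point of the slide stands regardless of the implementation's simplicity: the format is easy to emit, but the data suggested emitting it has no measurable payoff.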
+ <div class="speaker-notes" hidden="">Rustdoc had an RFC proposing LLM-friendly text output. Both the community and a T-rustdoc team member pushed back. workingjubilee from the rustdoc team said: the format you build today very likely isn't the format you'll need in three months, let alone three years. The broader community agreed it should be an external tool consuming rustdoc's existing JSON output.</div>
<li>Enabled by default — every Elixir project gets it</li>
+ <li>First major doc tool to ship AI-oriented output</li>
</ul>
</div><div class="slide-num">33</div>
- <div class="speaker-notes" hidden="">We also considered generating Markdown output alongside HTML — a format AI could consume directly. But there's no standard for how to serve it, no consensus on what AI tools would actually use, and the maintenance cost would fall on every RDoc user. The risk outweighs the reward right now.</div>
+ <div class="speaker-notes" hidden="">Meanwhile, Elixir's ExDoc took the opposite approach. Version 0.40.0 shipped a Markdown formatter and llms.txt generation, enabled by default. Every Elixir project now produces machine-readable documentation out of the box. As far as I know, they're the first major doc tool to ship AI-oriented output features.</div>
<li>Rustdoc RFC (rust-lang/rfcs#3751): proposed LLM-friendly text output</li>
- <li>Community rejected it: "This should be an external tool"</li>
- <li>T-rustdoc team: "Very likely not the desired format within 3 months, never mind 3 years"</li>
+ <li>The most complete standard for LLM-friendly docs</li>
+ <li><strong style="color:var(--text)">~10% adoption</strong> across 300K domains, <strong style="color:var(--accent)">zero measurable impact</strong> on AI citations ¹</li>
+ <li>No major LLM provider officially consumes it</li>
</ul>
+ <p class="reveal" style="position:absolute;bottom:clamp(0.8rem, 2vw, 1.5rem);left:clamp(1rem, 3vw, 2rem);font-size:clamp(0.65rem, 1vw, 0.8rem);color:var(--text-dim)">¹ SE Ranking, "LLMs.txt: Why Brands Rely On It and Why It Doesn't Work" (seranking.com/blog/llms-txt)</p>
</div><div class="slide-num">34</div>
- <div class="speaker-notes" hidden="">We're not the only ones wrestling with this. Rustdoc had an RFC proposing LLM-friendly text output. The community rejected it — the consensus was that this belongs in external tooling, not in the doc generator itself. The Rustdoc team's argument was blunt: the format you build today very likely isn't the format you'll need in three months, let alone three years.</div>
+ <div class="speaker-notes" hidden="">I actually prototyped llms.txt generation for RDoc. Then I looked at the data. SE Ranking studied 300,000 domains — only 10% adoption. An XGBoost model for predicting AI citations actually improved when llms.txt was removed as a variable. No major LLM provider officially supports it. We decided it's not worth the complexity right now.</div>
<h2 class="reveal">RDoc: studying while building the foundation</h2>
<ul class="reveal">
- <li>No major doc generator has shipped AI-specific output features</li>
- <li>All AI-docs integration happens at the consumption layer (MCP servers, Dash, DevDocs)</li>
- <li>Doc generators should focus on producing good output</li>
+ <li>We're watching how these approaches play out</li>
+ <li>Priority now: make the documentation better at the source</li>
+ <li>Any AI-specific format risks being obsolete in months</li>
</ul>
- <p class="reveal" style="margin-top:var(--content-gap);font-size:var(--body-size);color:var(--text-secondary)">If I missed anything in this space, I'd love to hear about it after the talk.</p>
</div><div class="slide-num">35</div>
- <div class="speaker-notes" hidden="">As far as I can tell, no major documentation generator — Rustdoc, Javadoc, Sphinx, TypeDoc — has shipped AI-specific output features. All the AI-docs integration is happening at the consumption layer: MCP servers, Dash, DevDocs. Doc generators should focus on producing good, structured output, and let the consumption layer evolve independently. If I missed anything in this space, I'd genuinely like to hear about it.</div>
+ <div class="speaker-notes" hidden="">Different communities are making different bets. We're studying these approaches — what works, what doesn't — while we focus on improving the foundation of RDoc. Better docs at the source benefit every consumption method, current and future. We're not opposed to shipping AI-specific output, but we want to do it when the landscape settles, not chase a moving target.</div>
</section>

- <!-- SLIDE 31: DOCS.RUBY-LANG.ORG REACHES MORE THAN HUMANS -->
- <h2 class="reveal">docs.ruby-lang.org reaches more than humans now</h2>
- <ul class="reveal">
- <li>AI models train on it. AI tools reference it when generating code.</li>
- <li>Improving Ruby's docs improves every AI tool that uses them.</li>
- </ul>
- </div><div class="slide-num">36</div>
- <div class="speaker-notes" hidden="">docs.ruby-lang.org isn't just read by developers anymore. AI models train on it, and AI coding tools reference it when generating Ruby code. So when we improve Ruby's documentation, we're not just helping the people who read the docs directly — we're improving every AI tool that consumes them.</div>
- <h2 class="reveal">Types in docs, not type checking at generation</h2>
- <ul class="reveal">
- <li>Endoh benchmark (March 2026): Ruby is #1 for AI code generation ($0.36/run)</li>
- <li>Adding a type checker at generation time costs 2-3x overhead</li>
- <li>Types already in documentation give AI the information without that cost</li>
- </ul>
- </div><div class="slide-num">37</div>
- <div class="speaker-notes" hidden="">Yusuke Endoh's benchmark from March 2026 showed Ruby is number one for AI code generation cost at $0.36 per run. Adding a type checker at generation time would cost 2-3x overhead. But types already present in documentation give AI the same information without that runtime cost. That's the value of RBS in docs.</div>
- </section>
-
- <!-- SLIDE 33: WE DON'T KNOW WHAT AI WILL NEED IN A YEAR -->
- <h2 class="reveal">We don't know what AI will need in a year</h2>
- <ul class="reveal">
- <li>llms.txt looked promising six months ago. The data says otherwise.</li>
- <li>Any AI-specific format risks being obsolete in months.</li>
- <li>So we focus on Markdown, types, clean architecture — things that last.</li>
- </ul>
- </div><div class="slide-num">38</div>
- <div class="speaker-notes" hidden="">We don't know what AI tools will need in a year. llms.txt looked promising six months ago — the data says it doesn't help. Any AI-specific format we build today risks being obsolete by the time it ships. The Rust community reached the same conclusion in their rustdoc discussion. So we focus on things that last regardless of how AI evolves: Markdown, type signatures, clean architecture.</div>
<li>AI reads Ruby's docs too now — improving them helps everyone</li>
<li>RDoc's priority: make docs easier to write and maintain</li>
<li>The foundation is rebuilt: Prism, Markdown, server mode, RBS, Aliki</li>
</ul>
- </div><div class="slide-num">39</div>
+ </div><div class="slide-num">36</div>
<div class="speaker-notes" hidden="">Three takeaways. AI reads Ruby's documentation too now — so improving it helps both developers and the tools they use. RDoc's priority is making docs easier to write and maintain. And the foundation is rebuilt — Prism parser, Markdown support, server mode, RBS type signatures, Aliki theme.</div>
</section>
@@ -681,7 +654,7 @@ <h2 class="reveal">Try it today</h2>
<div class="speaker-notes" hidden="">The most important message: it's now much easier to contribute to Ruby documentation. If you've ever wanted to improve Ruby's docs but didn't know the markup, that barrier is gone — it's Markdown. If you maintain a gem, try rdoc --server. It takes thirty seconds.</div>
<div class="speaker-notes" hidden="">Thank you for your time. I want to thank tompng — Tomoya Ishida — for the three subsystem rewrites that made everything else possible. The Shopify Ruby DX team for countless ideas and support. And the Ruby committers who reviewed the many PRs. I'll be around for questions.</div>