feat: add sphinx-sitemap for improved Google indexing #554
Conversation
Add sphinx-sitemap extension to generate a complete sitemap with all individual page URLs, improving search engine discoverability for documentation pages that were previously not being indexed by Google.
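For reference, a minimal conf.py sketch of how sphinx-sitemap is typically wired up. The extension name and `html_baseurl` value come from this PR's review comments; `html_extra_path` and the `sitemap_url_scheme` override are assumptions about the surrounding setup, not the actual diff:

```python
# conf.py -- illustrative sketch, not the exact change set from this PR

extensions = [
    # ...existing extensions...
    "sphinx_sitemap",  # generates sitemap.xml during the HTML build
]

# sphinx-sitemap needs an absolute base URL to emit full page URLs;
# this value matches the setting referenced in the review below
html_baseurl = "https://docs.dash.org/en/stable/"

# Assumption: ship static files such as _extra/robots.txt at the site root
html_extra_path = ["_extra"]

# Assumption: since html_baseurl already contains language and version,
# emit plain page links rather than the default "{lang}{version}{link}"
sitemap_url_scheme = "{link}"
```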
Warning: Rate limit exceeded
⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered with the `@coderabbitai review` command. We recommend spacing out your commits to avoid hitting the rate limit.
🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Paid plans have higher rate limits than the trial, open-source, and free plans. In all cases, further reviews are re-allowed after a brief timeout. Please see our FAQ for further information.
📝 Walkthrough: Adds Sphinx sitemap generation so the documentation build emits a sitemap.xml, with `_extra/robots.txt` pointing search engines at it.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes. 🚥 Pre-merge checks: ✅ 3 of 3 passed.
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `_extra/robots.txt`:
- Around lines 1-8: The header comment in _extra/robots.txt incorrectly claims the file is "autogenerated by Read the Docs". Update the top comment to remove or change the autogeneration claim so it accurately reflects that the file is manually maintained (e.g., replace "This robots.txt file is autogenerated by Read the Docs." with a line like "This robots.txt file is maintained in version control and controls crawling and indexing of the documentation."). Ensure the new header still references the Read the Docs guidance links if desired, and commit the updated robots.txt. A sketch of one possible header follows below.
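One possible corrected header along the lines the reviewer suggests; the wording is illustrative, and the original Read the Docs guidance links are represented by a placeholder comment rather than reproduced:

```text
# This robots.txt file is maintained in version control and controls
# crawling and indexing of the documentation.
#
# (Read the Docs guidance links from the original header can be kept here.)
```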
🧹 Nitpick comments (1)
_extra/robots.txt (1)
13-13: Consider adding a trailing newline. The file ends without a trailing newline, which deviates from POSIX text-file conventions. While not critical, adding a newline at the end can prevent warnings from some tools.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
_extra/robots.txt
🔇 Additional comments (3)
_extra/robots.txt (3)
9-9: LGTM! The User-agent directive is correctly configured to apply to all search engine crawlers.
11-11: Verify the intent to hide /en/latest/ from search engines. The Disallow directive prevents search engines from indexing the `/en/latest/` version. Ensure this aligns with your documentation strategy; typically you'd want to hide unstable/development versions and only index stable releases.
13-13: The sitemap URL is correctly configured and requires no changes. The `https://docs.dash.org/en/stable/sitemap.xml` URL properly aligns with the `html_baseurl = 'https://docs.dash.org/en/stable/'` setting in conf.py, and the file ends with a trailing newline as expected.
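Putting the three reviewed lines together, the directive portion of `_extra/robots.txt` looks roughly like this (blank lines inferred from the reviewer's line references; the committed file may differ):

```text
User-agent: *

Disallow: /en/latest/

Sitemap: https://docs.dash.org/en/stable/sitemap.xml
```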
@coderabbitai review
✅ Actions performed: review triggered.
Summary by CodeRabbit
New Features
- Documentation builds now generate a sitemap covering every page, improving search engine discoverability.
Chores
- `_extra/robots.txt` updated to point search engines at the generated sitemap.