Conversation

@thephez thephez commented Jan 14, 2026

Add sphinx-sitemap extension to generate a complete sitemap with all individual page URLs, improving search engine discoverability for documentation pages that were previously not being indexed by Google.

Summary by CodeRabbit

  • New Features

    • Automatic XML sitemap generation for documentation; search results and the main index are excluded.
    • Added a robots.txt to guide crawlers and reference the sitemap.
  • Chores

    • Added sitemap support and related documentation-build dependencies to enable sitemap and robots output.

coderabbitai bot commented Jan 14, 2026

Warning

Rate limit exceeded

@thephez has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 6 minutes and 9 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between 3868b1a and 96eede1.

📒 Files selected for processing (1)
  • _extra/robots.txt
📝 Walkthrough

Walkthrough

Adds Sphinx sitemap generation: includes sphinx_sitemap in conf.py, configures sitemap settings (base URL, URL scheme, locales, excludes, lastmod, indent), adds required Python packages to requirements, and adds an autogenerated robots.txt under _extra/.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| **Sphinx Configuration**<br>`conf.py` | Added `sphinx_sitemap` to extensions and configured sitemap settings: `html_baseurl`, `sitemap_url_scheme`, `sitemap_locales`, `sitemap_excludes` (`search.html`, `genindex.html`), `sitemap_show_lastmod`, and `sitemap_indent`. |
| **Python Dependencies**<br>`requirements.in`, `requirements.txt` | Added `sphinx-sitemap==2.9.0` to `requirements.in`; updated `requirements.txt` with `sphinx-sitemap==2.9.0` and `sphinx-last-updated-by-git==0.3.8` (comments note provenance). |
| **Site extras**<br>`_extra/robots.txt` | Added an autogenerated `robots.txt` configuring crawling (disallowing `/en/latest/`) and referencing the sitemap at `https://docs.dash.org/en/stable/sitemap.xml`. |
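The conf.py changes summarized above can be sketched as follows. The option values mirror the walkthrough; the surrounding conf.py content (the rest of the `extensions` list, exact indent value) is assumed rather than taken from the commit.

```python
# conf.py -- sketch of the sitemap-related settings described in the
# walkthrough; values are inferred from the summary, not the actual diff.

extensions = [
    # ... existing extensions ...
    "sphinx_sitemap",
]

# Base URL that sphinx-sitemap prefixes to every generated page path.
html_baseurl = "https://docs.dash.org/en/stable/"

# "{link}" emits plain page paths with no language/version placeholders.
sitemap_url_scheme = "{link}"

# A single unlocalized entry per page (no per-locale variants).
sitemap_locales = [None]

# Pages excluded from the sitemap.
sitemap_excludes = [
    "search.html",
    "genindex.html",
]

sitemap_show_lastmod = True  # include <lastmod> entries per page
sitemap_indent = 2           # assumed value; pretty-prints the XML
```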

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit’s high-level summary is enabled. |
| Title check | ✅ Passed | The pull request title accurately reflects the main change: adding sphinx-sitemap extension for improved Google indexing, which is demonstrated across all modified files. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@_extra/robots.txt`:
- Around line 1-8: The header comment in _extra/robots.txt incorrectly claims
the file is "autogenerated by Read the Docs"; update the top comment in the
robots.txt file to remove or change the autogeneration claim so it accurately
reflects that the file is manually maintained (e.g., replace "This robots.txt
file is autogenerated by Read the Docs." with a line like "This robots.txt file
is maintained in version control and controls crawling and indexing of the
documentation."). Ensure the new header still references the Read the Docs
guidance links if desired and commit the updated robots.txt.
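Applying the suggested header change, the file could read roughly as follows. This is a sketch assembled from the directives quoted in the review comments, not the committed file:

```
# This robots.txt file is maintained in version control and controls
# crawling and indexing of the documentation.

User-agent: *

Disallow: /en/latest/

Sitemap: https://docs.dash.org/en/stable/sitemap.xml
```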
🧹 Nitpick comments (1)
_extra/robots.txt (1)

13-13: Consider adding a trailing newline.

The file ends without a trailing newline, which deviates from POSIX text file conventions. While not critical, adding a newline at the end can prevent warnings from some tools.
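The trailing-newline check is easy to script. A small sketch (the helper name and the demo file are illustrative, not from the PR):

```shell
# Report whether a file ends with a trailing newline.
check_newline() {
  # $(...) strips a trailing newline, so an empty result means the
  # last byte was "\n" (or the file is empty).
  if [ -n "$(tail -c 1 "$1")" ]; then
    echo "$1: missing trailing newline"
  else
    echo "$1: ok"
  fi
}

# Demo with a local file so the snippet is self-contained.
printf 'Sitemap: https://docs.dash.org/en/stable/sitemap.xml' > demo.txt
check_newline demo.txt   # missing trailing newline
printf '\n' >> demo.txt
check_newline demo.txt   # ok
rm demo.txt
```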

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 73d5979 and c5b685a.

📒 Files selected for processing (1)
  • _extra/robots.txt
🔇 Additional comments (3)
_extra/robots.txt (3)

9-9: LGTM!

The User-agent directive is correctly configured to apply to all search engine crawlers.


11-11: Verify the intent to hide /en/latest/ from search engines.

The Disallow directive prevents search engines from indexing the /en/latest/ version. Ensure this aligns with your documentation strategy—typically, you'd want to hide unstable/development versions and only index stable releases.


13-13: The sitemap URL is correctly configured and requires no changes.

The https://docs.dash.org/en/stable/sitemap.xml URL properly aligns with the html_baseurl = 'https://docs.dash.org/en/stable/' setting in conf.py, and the file ends with a trailing newline as expected.


thephez commented Jan 14, 2026

@coderabbitai review

coderabbitai bot commented Jan 14, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@thephez thephez merged commit 99ae164 into dashpay:23.0.0 Jan 15, 2026
1 check passed
@thephez thephez deleted the sitemap branch January 15, 2026 14:10