Tell crawlers not to index comment permalink pages
We don't want to spend crawl budget or rank on what are essentially
duplicate pages. In case we have inbound links to these pages, we don't
want the robots.txt to prevent crawlers from accessing them.
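That trade-off is why the meta tag is used here: a robots.txt Disallow rule stops crawlers from fetching a page at all, so the URL can still end up indexed purely from inbound links, and the crawler never gets a chance to see a noindex directive on the page. Serving the page normally with a robots meta tag lets crawlers fetch it and then drop it from the index. A sketch of the contrast, with a hypothetical Disallow pattern that is not taken from reddit's actual robots.txt:

    # robots.txt (deliberately NOT used here): blocks crawling, so any
    # noindex hint on the page would never be seen by the crawler.
    User-agent: *
    Disallow: /r/*/comments/

    <!-- What this change emits instead, in the page's <head>: the page
         stays crawlable, but search engines keep it out of the index. -->
    <meta name="robots" content="noindex">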
prashtx committed Oct 19, 2016
1 parent 94a7083 commit 098d047
Showing 1 changed file with 7 additions and 1 deletion.
r2/r2/lib/pages/pages.py
@@ -1745,7 +1745,13 @@ def __init__(self, link = None, comment = None, disable_comments=False,
         self.num_duplicates = num_duplicates
 
         self.show_promote_button = show_promote_button
-        robots = "noindex,nofollow" if link._deleted or link._spam else None
+        if link._deleted or link._spam:
+            robots = "noindex,nofollow"
+        elif comment:
+            # We don't want crawlers to index the comment permalink pages.
+            robots = "noindex"
+        else:
+            robots = None
 
         if 'extra_js_config' not in kw:
             kw['extra_js_config'] = {}
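The robots value computed above presumably flows into the page template and is rendered as a meta tag in the document's <head>. A minimal sketch of that last step, using a hypothetical helper rather than reddit's actual template code:

    def robots_meta_tag(robots):
        """Render a robots meta tag, or nothing when no directive is set.

        `robots` is the value computed in the diff above: "noindex,nofollow"
        for deleted or spam links, "noindex" for comment permalink pages,
        and None for an ordinary link page, which gets no tag at all.
        """
        if robots is None:
            return ""
        return '<meta name="robots" content="%s">' % robots

So a comment permalink renders <meta name="robots" content="noindex">: crawlers may still fetch the page and follow its links, but the page itself is excluded from search results.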
