RobotsDotTxt
============

robots.txt file generation for Rails, based on a YAML config file.

Installation
============

- install as a plugin
- run rake robots:install
- add this route to your config/routes.rb file (see the snippet below): match 'robots' => 'robots#index'
- edit the config/robots.yml file
- remove the public/robots.txt file
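
For reference, a minimal sketch of the routes file for a Rails 3 application is shown below. YourApp is a placeholder for your application's module name; because Rails appends an optional format segment to the route, a request for /robots.txt should reach the same action as a request for /robots.

YourApp::Application.routes.draw do
  # Route /robots (and /robots.txt, via the optional format segment)
  # to the controller installed by rake robots:install.
  match 'robots' => 'robots#index'
end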



Example
=======

Example robots.yml file:

development:
  firstRule:
      userAgent:
          - google
          - some_other_bot
      allow: "*"
      disallow: new_post_path
      requestRate: 1/500s
      crawlDelay: 500
      visitTime: 0200-0800
  secondRule:
      userAgent: google-mobile
      allow: /mobile
      disallow: /xx      
      sitemap: /dev_sitemap.txt
  all:
      sitemap: /sitemap.xml



As you can see, you specify rules per environment. Values can be given as YAML lists as well as Rails path names (new_post_path above), and each rule has a name.
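
This is only a sketch of the expected output: assuming the plugin maps each key to the matching robots.txt directive (userAgent to User-agent, requestRate to Request-rate, and so on) and that new_post_path resolves to /posts/new under a standard resources :posts route, the firstRule block above would render roughly like this:

User-agent: google
User-agent: some_other_bot
Allow: *
Disallow: /posts/new
Request-rate: 1/500s
Crawl-delay: 500
Visit-time: 0200-0800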


Copyright (c) 2011 novotarq, released under the MIT license
