galaxycats/robotstxt
Robotstxt

Robotstxt is a Ruby robots.txt parser.

Robotstxt lets you check whether URLs are accessible to your crawler and extract other data from the file.

Full support for the robots.txt RFC, wildcards, and Sitemap: rules.

Features

  • Check whether a URL is allowed to be crawled by your robot

  • Analyze the robots.txt file and return an Array with the list of XML Sitemap URLs
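
To make the two checks above concrete, here is a minimal, self-contained sketch in plain Ruby. It is not the gem's API: it ignores User-agent grouping (every Disallow rule applies) and simply expands `*` wildcards, which is enough to show the idea.

```ruby
# Minimal sketch of robots.txt handling (NOT the gem's API):
# extract Sitemap: URLs and match a path against Disallow rules,
# expanding "*" wildcards. User-agent grouping is omitted for brevity.

ROBOTS_TXT = "User-agent: *\n" \
             "Disallow: /private/\n" \
             "Disallow: /*.pdf\n" \
             "Sitemap: http://example.com/sitemap.xml\n"

# Return every URL listed on a "Sitemap:" line.
def sitemaps(txt)
  txt.lines.grep(/\Asitemap:/i).map { |l| l.split(":", 2).last.strip }
end

# Translate a robots.txt pattern into a Regexp ("*" matches any run of characters).
def pattern_to_regexp(pattern)
  Regexp.new("\\A" + pattern.split("*", -1).map { |p| Regexp.escape(p) }.join(".*"))
end

# A path is allowed unless it matches some non-empty Disallow pattern.
def allowed?(txt, path)
  disallows = txt.lines.grep(/\Adisallow:/i).map { |l| l.split(":", 2).last.strip }
  disallows.reject(&:empty?).none? { |d| path.match?(pattern_to_regexp(d)) }
end

sitemaps(ROBOTS_TXT)              # => ["http://example.com/sitemap.xml"]
allowed?(ROBOTS_TXT, "/a.html")   # => true
allowed?(ROBOTS_TXT, "/private/") # => false
```

A real parser also picks the rule group matching the robot's User-agent string and honors Allow rules; the gem handles those details for you.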

Requirements

  • Ruby >= 1.8.7

Installation

This library is intended to be installed via the RubyGems system.

$ gem install robotstxt

You might need administrator privileges on your system to install it.

Author

Simone Rinzivillo <[email protected]>

License

Copyright © 2009 Simone Rinzivillo, Robotstxt is released under the MIT license.
