webrobots

This is a library to help write robots.txt-compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

# Create a WebRobots instance identified by your bot's User-Agent token.
robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
# robots.txt is fetched and parsed lazily on the first query for each site.
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...
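
Beyond disallowed?, the library also provides allowed?, crawl_delay, and
sitemaps for the other common robots.txt directives. A minimal sketch
(example.com and the sleep handling are illustrative assumptions, not
part of the gem):

require 'webrobots'
require 'uri'

robots = WebRobots.new('MyBot/1.0')

uri = URI('http://example.com/some/page')
if robots.allowed?(uri)
  # Honor a Crawl-delay directive if the site declares one
  # (assumes crawl_delay returns nil or 0 when unspecified).
  delay = robots.crawl_delay(uri)
  sleep delay if delay && delay > 0
  # ... fetch the page here ...
end

# List any Sitemap URLs declared in the site's robots.txt.
robots.sitemaps(uri).each do |sitemap|
  puts sitemap
end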

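If you want robots.txt fetched with something other than the default
Net::HTTP-based client, pass a callable via the :http_get option. The
open-uri fetcher below is an illustrative assumption about one way to
wire it, not the gem's default behavior, and it omits error handling
for missing or unreachable robots.txt files:

require 'webrobots'
require 'open-uri'

# Delegate robots.txt retrieval to open-uri; the lambda receives the
# URI of a robots.txt file and should return its body as a string.
robots = WebRobots.new('MyBot/1.0',
                       :http_get => lambda { |uri| uri.read })
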
Requirements

Contributing to webrobots

Copyright

Copyright (c) 2010, 2011, 2012, 2013 Akinori MUSHA. See LICENSE.txt for further details.
