Voight-Kampff

Voight-Kampff is a Ruby gem that assists with user agent detection. Voight-Kampff can easily tell you whether a request is coming from a crawler, spider or bot. This can be especially helpful in analytics, such as page hit tracking.

Installation

gem install voight_kampff
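
If you use Bundler, the usual alternative is to add the gem to your application's Gemfile and run bundle install (a standard Bundler step, not specific to this gem):

    # Gemfile
    gem 'voight_kampff'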

Configuration

A YAML file is used to match user agent strings to their types.

If you'd like to use an updated list or make your own customizations, run rake voight_kampff:import_user_agents. This will download a user_agents.yml file into your Rails /config directory.

To use regular expression matching, set string_match: regex in a user agent entry. The first matching entry wins, so regular expression entries should generally go toward the bottom of the file.
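
As a rough illustration only, a regex-based entry might look something like the sketch below. The exact schema is defined by the downloaded user_agents.yml; the string and types keys here are assumptions, not confirmed, and only string_match: regex is taken from the description above.

    # Hypothetical entry; key names other than string_match are assumed for illustration.
    - string: 'SomeBot/\d+'
      string_match: regex
      types:
      - :bot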

Usage

There are three ways to use Voight-Kampff:

  1. In your Ruby on Rails controllers:
    request.bot?

  2. Through the VoightKampff module:
    VoightKampff.bot? 'your user agent string'

  3. Through a VoightKampff::Test instance:
    VoightKampff::Test.new('your user agent string').bot?

All of the above examples respond to the human?, bot?, browser?, checker?, downloader?, proxy?, crawler? and spam? methods. Each of these methods returns true, false, or nil (when the user agent isn't recognized).
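
For example, here is a minimal sketch of using this for page hit tracking in a Rails controller. The controller, the track_page_hit callback and the PageHit model are hypothetical; only request.bot? comes from the gem:

    class PagesController < ApplicationController
      # Hypothetical analytics hook; request.bot? is provided by Voight-Kampff.
      after_action :track_page_hit, only: :show

      private

      def track_page_hit
        # bot? returns true, false, or nil (unrecognized user agent),
        # so only an explicit false is treated as a human visitor.
        return unless request.bot? == false
        PageHit.create(path: request.path) # hypothetical model
      end
    end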

FAQ

Q: What's with the name?
A: It's the machine in Blade Runner that is used to test whether someone is a human or a replicant.

Q: My browser isn't being matched
A: The list is pulled from user-agents.org. If you'd like entries added to the list, please create a new issue or send me a pull request. And if you know of a better source for this sort of data, please let me know.
