Everything posted by leonardteo

  1. Hi guys, I'm currently using Revive Adserver to serve ads for a site, and clickthroughs currently appear to be highly inflated. Using Splunk to search through the Apache logs, I found that the culprit is bots/crawlers clicking on all the ads on every page. To avoid this in the future, I'm looking for a list of all (or many) crawler/bot user-agent strings as a flat text file, one user-agent string per line, that I can feed into Revive Adserver's ignore list. I found a number of sites like http://www.botsvsbrowsers.com/, http://www.robotstxt.org/db.html, and http://www.useragentstring.com/pages/Crawlerlist/, but none of them seem to offer a simple flat-file list of strings I can just use. Does anyone have any suggestions? Thanks, Leonard
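One workable stopgap while searching for a ready-made list is to build the flat file from your own access logs. The sketch below is a hypothetical example, not Revive-specific code: it assumes the Apache combined log format (user agent as the last quoted field), a small hard-coded set of marker substrings ("bot", "crawler", etc.), and an output filename of `bot_useragents.txt` chosen for illustration.

```python
import re

# Hypothetical sample of Apache combined-log lines; in practice you would
# read these from your real access log instead.
SAMPLE_LOG = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /ads/click?id=1 HTTP/1.1" 302 0 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '5.6.7.8 - - [10/Oct/2023:13:56:01 +0000] "GET /ads/click?id=2 HTTP/1.1" 302 0 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/118.0"',
    '9.9.9.9 - - [10/Oct/2023:13:57:12 +0000] "GET /ads/click?id=3 HTTP/1.1" 302 0 "-" "AhrefsBot/7.0 (+http://ahrefs.com/robot/)"',
]

# Substrings that commonly identify crawlers; extend this list as needed.
BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

# In the combined log format the user agent is the last double-quoted field.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def bot_user_agents(lines):
    """Return the unique user-agent strings that look like bots."""
    found = []
    for line in lines:
        m = UA_RE.search(line)
        if not m:
            continue
        ua = m.group(1)
        if any(marker in ua.lower() for marker in BOT_MARKERS) and ua not in found:
            found.append(ua)
    return found

# Write one user agent per line: the flat format the question asks for.
with open("bot_useragents.txt", "w") as f:
    f.write("\n".join(bot_user_agents(SAMPLE_LOG)) + "\n")
```

Running this over the sample keeps the Googlebot and AhrefsBot entries and skips the ordinary Chrome browser line; the resulting one-per-line file is the kind of list that can then be pasted into an ad server's user-agent ignore configuration.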