Rubicon Project tackles ‘botnets and non-human traffic’

Jun 24, 2013 | Online advertising

The Rubicon Project has stepped up efforts to further root out botnets and so-called non-human traffic (NHT).


“This is a topic of considerable concern within our industry, and we believe that it must be addressed by the entire industry in a collaborative way,” says Greg Raifman, President of the Rubicon Project. “It’s not a demand-side problem or a supply-side problem. Everyone in advertising technology has a role and stake here. At the Rubicon Project, we’ve been doing all that we can to reinforce the integrity of our REVV Platform. And our approach, which is intensifying, will remain vigilant, proactive and ongoing.”
The Rubicon Project’s continuing battle against botnets and non-human traffic plays out in several significant ways.
Using Cutting-Edge Analytical Detective Work to Identify Abnormal Traffic
The company has been engaged in cutting-edge analytical detective work, managed by humans and aided by technology. Using various detection methodologies, the Rubicon Project looks for publishers with abnormal traffic patterns.
The process is supported by a set of proprietary algorithms that use a variety of signals to score each impression numerically in real time, by algorithms within Rubicon’s Brand Protection suite, and by other pattern-analysis algorithms.
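The article does not disclose how these proprietary algorithms work. As a rough, hypothetical illustration of the kind of signal-based impression scoring it alludes to, the sketch below combines a few plausible signals into a single score; the signal names, weights, and threshold are invented for this example and are not Rubicon Project’s actual implementation.

```python
# Hypothetical sketch of signal-based impression scoring; all signals,
# weights, and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Impression:
    requests_per_minute_from_ip: int   # burst rate from the requesting IP
    has_plausible_user_agent: bool     # UA string matches a known browser
    referrer_matches_site: bool        # referrer consistent with the declared site
    click_through_rate: float          # historical CTR for this placement


def nht_score(imp: Impression) -> float:
    """Return a 0-1 score; higher means more likely non-human traffic."""
    score = 0.0
    if imp.requests_per_minute_from_ip > 60:   # abnormally fast request rate
        score += 0.4
    if not imp.has_plausible_user_agent:
        score += 0.3
    if not imp.referrer_matches_site:
        score += 0.2
    if imp.click_through_rate > 0.10:          # implausibly high CTR
        score += 0.1
    return min(score, 1.0)


def flag_for_review(imp: Impression, threshold: float = 0.5) -> bool:
    """Flag an impression for human follow-up when its score crosses the threshold."""
    return nht_score(imp) >= threshold
```

In practice such scores would feed the human review step described next: flagged traffic prompts a conversation with the publisher rather than an automatic verdict.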
Traffic patterns flagged by the automated analysis trigger discussions with publishers, and publishers who cannot adequately explain the abnormal patterns are terminated. The Rubicon Project is also working with publishers who are unknowingly buying botnet traffic to correct such practices.
And the company will continue to refine its detection methodologies within the Brand Protection suite of products, while working with publishers on best practices to avoid non-human traffic.
Leading the Charge Against the Industry’s Bad Actors
“This is an extremely important issue to Rubicon Project, and to me, personally,” explains Jeremy Fain, Vice President of Client Services at Rubicon Project. “While at the IAB, I led the International Spiders and Bots List efforts and oversaw the ad-serving certification for both buy- and sell-side platforms. Here at the Rubicon Project, the quality of our inventory is unsurpassed, because of our constant efforts to detect and eliminate botnets and other non-human activity. When it’s determined that publishers have violated our policies by passing us non-human traffic, we forbid them from transacting or using our REVV Platform. If we suspect a publisher is acting inappropriately, we monitor its behavior carefully, and, if necessary, terminate it as a client. When a buyer reports suspicious activity, we investigate thoroughly and take appropriate action.”
Establishing a Serious Measure of Safety For Buyers and Sellers
Rubicon Project is also playing a major role in the IAB Traffic of Good Intent task force chaired by John Battelle and Penry Price. This influential industry group’s mission is to identify the drivers of botnet and NHT activity and educate the industry on best practices.
“Ultimately,” says Raifman, “the industry needs to identify all technological and business choke points where friction can be added to blunt the effects and incentives of bad actors. We’re strongly and substantively out front in this area; we’re achieving a serious measure of safety for both buyers and sellers; and we totally support the constructive measures taken by the IAB.”
Building on IAB Standards
Driven by the Rubicon Project’s participation in the IAB OpenRTB & Exchange Infrastructure Working Group, the company’s engineers are collaborating with DSP partners and other exchanges to extend the capabilities of the IAB OpenRTB protocol and to create a set of symmetric best practices for implementations.
Under the proposed protocol extension, exchanges and bidders would classify impressions in real time and filter non-human traffic. Exchanges would reject traffic that fails classification; bidders would “no-bid” NHT-classified traffic, returning a code that signals this to the exchange; and the exchange could then log these signals as inputs back into its classifier.
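The article does not specify the fields of the proposed extension. As a rough sketch of the flow it describes, the example below shows a bidder returning a no-bid response that carries a reason code for suspected NHT, and an exchange logging that signal for its classifier; the field name `nbr`, the reason-code value, and the function names are illustrative assumptions, not the text of the published extension.

```python
# Hypothetical sketch of the proposed bidder/exchange NHT signalling flow.
# The "nbr" field name and the reason code below are assumptions for
# illustration only.

import json
from typing import Callable, List

SUSPECTED_NHT = 4  # assumed no-bid reason code for suspected non-human traffic


def build_bid_response(bid_request: dict, looks_like_nht: Callable[[dict], bool]) -> dict:
    """Bidder side: rather than silently dropping a suspect request, return a
    no-bid response carrying a reason code so the exchange learns why."""
    if looks_like_nht(bid_request):
        return {"id": bid_request["id"], "nbr": SUSPECTED_NHT}
    # Normal bidding path, elided here; an empty seatbid stands in for a real bid.
    return {"id": bid_request["id"], "seatbid": []}


def record_bidder_signal(bid_request: dict, response: dict, classifier_log: List[dict]) -> None:
    """Exchange side: log the bidder's NHT flag as an input back into the classifier."""
    if response.get("nbr") == SUSPECTED_NHT:
        classifier_log.append({"request_id": bid_request["id"], "signal": "bidder_flagged_nht"})


if __name__ == "__main__":
    log: List[dict] = []
    request = {"id": "abc123", "imp": [{"id": "1"}]}
    response = build_bid_response(request, looks_like_nht=lambda req: True)
    record_bidder_signal(request, response, log)
    print(json.dumps(log, indent=2))
```

The design point is the feedback loop: a bidder’s explicit no-bid reason becomes a training signal for the exchange’s own classification, rather than being lost as a silent non-response.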
“This is a modest change to OpenRTB with a powerful potential,” says Neal Richter, Chief Scientist at the Rubicon Project, and co-chair of the IAB OpenRTB & Exchange Infrastructure Working Group, “and that’s why many DSPs and exchanges have committed to supporting this protocol extension.”
www.rubiconproject.com
