Unfortunately, I felt the need to write one more posting about Facebook advertising. It is unfortunate because our business is about apps to help people change their lives for the better, not about advertising, and I hate wasting time. However…
When I read Rory Cellan-Jones’ partially correct and rightfully surprised BBC article about Facebook advertising last night and then Mike Butcher’s more pithy but somewhat less thoughtful insider TechCrunch dismissal of his findings, I decided I had to follow up.
The short summary of the problem is that some people who use Facebook ‘like’ huge numbers of things with relatively little discrimination. They use Facebook more like most of us use Twitter (or like teenagers used to use MySpace before the fall), and the new Facebook interface encourages them to do it. I call them ‘booklicants’ because at first I thought they were fake profiles, just like the BBC did. But Rory Cellan-Jones is wrong, and so is Mike Butcher, just like I was.
I have previously written about how to avoid the problem the BBC found, how this problem may represent a material financial risk for Facebook, and how Facebook knows about the problem and even exacerbated it with their new user interface and yet has a policy of placing the onus and blame on their advertiser customers.
So far there aren’t very many of these booklicants in the grand scheme of Facebook’s hundreds of millions of users. From what I can tell, they tend to exist in countries where the new Facebook interface has been in place the longest, so I suppose we should expect to see more of them in the US, UK and so forth over time. However, they already account for a VASTLY disproportionate share of the clicks that advertisers pay for, relative to their numbers.
Facebook knows they exist because 1) I told them, 2) they recently thwarted a class-action lawsuit on behalf of advertisers that is now under appeal and 3) they seem to have an algorithm that invalidates clicks after a booklicant likes a truly huge number of things, but doesn’t retroactively invalidate all the clicks by that user. If my analysis on that last point is right, then it means they are knowingly taking money from advertisers for clicks that their own internal algorithms show are almost certainly invalid.
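To make that last point concrete, here is a minimal sketch of the behavior I am speculating about. The threshold value, the function name, and the logic are entirely my assumptions about what Facebook might be doing, not their actual algorithm; the point is simply the difference between dropping future clicks and refunding past ones.

```python
# Hypothetical sketch of the click-invalidation behavior described above.
# LIKE_THRESHOLD and the rules are my guesses, not Facebook's real algorithm.

LIKE_THRESHOLD = 500  # assumed cutoff after which a user's clicks stop billing

def bill_clicks(click_stream):
    """Return the clicks an advertiser is billed for.

    click_stream is a list of (user_id, running_like_count) tuples in
    chronological order. Clicks made after a user crosses the threshold
    are dropped, but clicks made before it are never refunded -- i.e. the
    invalidation is not retroactive.
    """
    billed = []
    for user_id, like_count in click_stream:
        if like_count <= LIKE_THRESHOLD:
            billed.append((user_id, like_count))
    return billed

# A booklicant's click history: they cross the threshold mid-stream.
stream = [("u1", 480), ("u1", 499), ("u1", 501), ("u1", 502)]
print(bill_clicks(stream))  # the first two clicks are still billed
```

If something like this is running, the advertiser still pays for every click the user made on the way up to the threshold, which is exactly the pattern I seem to be seeing in my campaign data.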
This problem does not hit most advertisers very hard on a percentage basis, but unfortunately it hits smaller advertisers hardest. That is not only because they may be less sophisticated and do a poorer job targeting ads like the BBC did – Mike Butcher is right about that, even if he is wrong about the nature of the underlying problem. It is also because they spend less money.
If you only have $1000 to spend, even if you target it carefully, many of the first ‘likes’ you get will come from people who like a lot of things not because they actually like the ads or the company, but because 1) they like to collect badges and 2) they believe it will change the stream of content they receive. Both are reasonable reasons to click from a user perspective, but very low value from a small advertiser perspective. Unfortunately, the small advertiser has to pay the same price for those low-value clicks as for the more meaningful clicks that will come later as the campaign progresses.

Even worse, the small advertiser will see their click-through rate drop very rapidly after the first couple of days as those click-happy booklicants exhaust themselves. That encourages the advertiser to refresh the ads or refine their targeting, which gives Facebook a chance to bring in a new set of users, including some fresh booklicants who will quickly like the ads as well. The result is that if you spend less than $1000 on Facebook advertising in almost any demographic, but especially in countries where the new interface has been around for a while, you will waste a meaningful percentage of your budget on people who don’t really like you the way you think they do.
Big advertisers pay ‘too much’ for those early booklicant clicks as well, but their campaigns are so large that it isn’t a big deal for them. Plus, like Mike Butcher, they are jaded and already “know” there is a lot of click-fraud in the world…they’ve seen it all before with Google and SEO scams and such.
However, this is different, and folks like Mike Butcher don’t really “know” as much as they think. In this case, Facebook has nearly perfect information, since these users have to log in, and most of their clicks are tracked. That means Facebook knows full well that there is a class of real users who click a lot, and their algorithms do seem to attempt to trim some of those clicks, even though they don’t invalidate clicks retroactively. It doesn’t seem to constitute fraud, since all these clicks are from real people. However, it is certainly misleading to advertisers to charge the same price for clicks from people whose behavior is dramatically and consistently different from other users in a way that their own algorithms are designed around. That is especially true when your algorithm only invalidates some clicks from these people. And it is even worse if it is a practice that has gone on for a long time and is getting worse, as it appears to be.
I validated most of this by proving I could generate about 5,000 likes for our Facebook page for a little less than $200, or about $0.04 a click. The population of people I collected was essentially the same as the people I collected when paying $0.20 per click. The method below also seems to work in the US and the UK, though at higher CPCs. Here is how I did it:
1. Create an ad targeted however you want and pay the CPC bid that Facebook recommends. It seems you have to do this for Facebook to give you inventory in the first place.
2. Once it is approved and has collected a few clicks, drop your bid by 50% or so, wait until you get a few more clicks, and watch as your click-through rate rises.
3. After your click-through rate either doubles or goes above 0.1% (whichever is higher), set your bid to 90 to 95% of your original bid.
4. Watch as the clicks keep coming in for a few days and your click-through rate climbs, perhaps as high as 2% or more.
5. Over time, your impressions will decrease. When the number of impressions per day drops below 10% of the rate you saw in Step 2, raise your bid back to the recommended range to get more impressions.
6. Repeat Steps 2-5 a few times. If you create variations of your ad that use the same copy and image but slightly different targeting rules, you can get even better results. This may take a couple of weeks.
7. Once the ads seem “burned out,” put your bid back to the normal range and let the campaign run as long as you like.
I call it the ‘Facebook Flush’ because if you follow these steps, you will have flushed out all the booklicants in your segments for a low price, and then your ads will be running against normal people who don’t click on ads very often. It will feel a bit depressing after your awesomely high click-through rates and cheap prices early on, but that’s reality.
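For the programmatically inclined, the bid schedule above can be sketched as a single decision function. This is only an illustration of the steps, not an integration with any real ads API; the function name, the parameters, and the stats you would feed it are all hypothetical, and the thresholds come straight from the steps above.

```python
# Hypothetical sketch of the 'Facebook Flush' bid schedule described above.
# You would call this with your campaign's current stats (however you
# collect them) and apply the returned bid by hand or via whatever ads
# tooling you use. All names and numbers here are illustrative.

def flush_bid(recommended_bid, baseline_ctr, current_ctr,
              baseline_impressions, current_impressions, burned_out):
    """Return the CPC bid to use next, following the numbered steps."""
    if burned_out:
        # Step 7: ads are burned out, go back to the normal range.
        return recommended_bid
    if current_impressions < 0.10 * baseline_impressions:
        # Step 5: impressions dried up, bid back up to get inventory.
        return recommended_bid
    if current_ctr >= max(2 * baseline_ctr, 0.001):
        # Step 3: CTR doubled or passed 0.1%, set bid to ~90% of original.
        return 0.90 * recommended_bid
    # Step 2: ad is approved and collecting clicks, drop the bid by ~50%.
    return 0.50 * recommended_bid

# Example: CTR has climbed past the Step 3 threshold, impressions healthy.
next_bid = flush_bid(0.20, 0.0005, 0.0012, 10000, 9000, False)
```

The design point is that the whole flush is just a loop over this function: each time your stats change, it tells you which phase you are in and what the bid should be.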
I think Facebook should do this automatically, so let me end by sharing the last customer support email I received from Facebook about all this after my earlier blog posting:
From: The Facebook Ads Team
Date: Tue, Jun 12, 2012 at 9:06 AM
Subject: Re: Help with Your Ads Manager
I hope you’re doing well. I reached out to an appropriate team to look into.
I understand your concern regarding your Facebook Ads. However, we aren’t able to pursue this any further without detailed click logs from you. Please contact your web hosting company directly if you have questions about how to obtain these server logs.
Our records show that you’ve only been billed for valid clicks. Your server logs will show us what you’re seeing on your end, which, when compared to the traffic we’re sending to your site, will help us identify whether any invalid activity is occurring, and help us clarify for you why you may be seeing discrepancies if not. Without this data, we simply aren’t able to give you any more information about the clicks you’re concerned about.
Please note that these investigations take a significant amount of time so it could be weeks until we have any updates after you send us the server logs.
Plus, it actually make sense that people who already like a lot of pages will continue to like more pages. Kind of like some users tend to click on ads more than others.
Global Marketing Solutions
Yep, you are right, Neil, and so are the MBAs and/or lawyers who wrote and/or reviewed this response before you sent it.
The fact that they again asked for my logs when the entire problem is on their servers in their app is completely unforgivable in my mind – if I didn’t know better, I might have wasted time trying to comply with their request, or more likely just given up. When I look back to my experience running Acrobat.com as we rapidly grew to millions of users, we would never have treated a customer that way. So, instead of just giving up or giving in, I came up with the Facebook Flush and shared it with y’all.
But Rory and Mike, you guys don’t have it right. So please do a little more investigating, perhaps in partnership, since Rory started the investigative work and Mike has the industry expertise. Facebook will probably tell you they can’t share their algorithms because they are theoretically sensitive for security reasons. They may also claim they can’t share them because of an ongoing legal case. Don’t give up; keep asking questions.
For instance, Rory, how many ‘likes’ did all of your fans generate for other advertisers, what is the likely price paid for all those clicks, and thus what is the total money that Facebook collected based on the unusual behavior of those folks? How many of these people exist in the world? I made my own guesses here – I am probably wrong, but the number is big either way.
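The arithmetic behind that question is simple enough to sketch. Every input below is a placeholder guess of mine, chosen only to show the shape of the calculation; plug in real figures if you can pry them loose.

```python
# Back-of-envelope estimate of the money Facebook might collect from
# booklicant clicks. Every input is a placeholder guess, not real data.

booklicants = 1_000_000        # assumed number of booklicants worldwide
likes_per_booklicant = 2_000   # assumed paid 'likes' each one racks up
avg_cpc = 0.20                 # assumed average price paid per click

revenue = booklicants * likes_per_booklicant * avg_cpc
print(f"${revenue:,.0f}")  # with these guesses: $400,000,000
```

Even if my guesses are off by an order of magnitude in either direction, the number is big.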
And Mike, figuring out that number would be a real story no matter what you think of the BBC.