Historically, optimizing to reach an ROI goal required a systematic approach to bidding and ongoing testing of ads and keywords. Changes would be monitored for fluctuation in Average Position, CPA, Spend and of course, Conversions. And those that have been doing this know the struggle: hours of reporting, analyzing data, making bid changes, checking on the results and then allowing more time to measure the effect of those changes.
With the age of automation, we have seen many life-improving benefits come from technology. For digital marketers, one of those life-improving benefits came in the shape of Google’s automated bidding.
When automated bidding was first introduced, it was not a perfect science by any stretch. We would often see volume decrease to extremely low levels or cost per conversion skyrocket. But the automated bidding capabilities have come a long way and in many cases, we’re now seeing it beat our manual efforts. Below are the results of a recent test but first, a few notes about Smart Bidding.
What is Google Smart Bidding?
Google Smart Bidding is a type of automated bidding which uses machine learning to bid at the time of the auction to optimize for a given goal. Google's machine learning predicts the user's propensity to convert based on a wide range of signals. These signals include location, time, ad characteristics, browser, operating system and many more. Currently, Google offers a few different types of Smart Bidding:
- Enhanced CPC (eCPC)
- Target CPA (tCPA)
- Target ROAS
- Maximize Conversions
Smart Bidding goals can be set at the campaign and ad group level, and Google's algorithm will then dynamically change your bid within each auction, adjusting your keyword to a position where it thinks you are most likely to convert. This process is similar to what an advanced search marketer might do, but Google's machine learning uses many more signals than are available to ad managers. These additional signals make a compelling argument for moving to a tCPA strategy, so we decided to test it.
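The core idea behind a Target CPA bid can be sketched in a few lines. This is a simplified illustration, not Google's actual algorithm: if conversions should cost at most the target CPA, and the system predicts a conversion rate for a given auction, the break-even bid is simply their product.

```python
def smart_bid(target_cpa: float, predicted_cvr: float) -> float:
    """Derive a per-auction bid from a CPA goal (simplified sketch).

    If a conversion may cost at most `target_cpa`, and a click in this
    auction converts with probability `predicted_cvr`, then the
    break-even cost-per-click is their product. Google's real model
    folds in many more signals; this captures only the core arithmetic.
    """
    return target_cpa * predicted_cvr

# Example: $20 CPA goal, 5% predicted conversion rate -> $1.00 max CPC
bid = smart_bid(20.00, 0.05)
print(f"${bid:.2f}")  # -> $1.00
```

A higher predicted conversion rate justifies a higher bid for that particular auction, which is why the same keyword can receive very different bids from one query to the next.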
Testing Smart Bidding Target CPA
We had many questions, suspicions, and preconceived notions…
- How well can this actually work if you are no longer conducting keyword-level bid optimization?
- Google will just keep bidding you up, creating a leapfrog effect in the marketplace, and benefit from CPCs being driven up
- We are giving up CONTROL and handing it over to Google
To help embrace the "why" of automated bidding, it's important to examine the concept. Years of manual bidding have taught us many things, one of which was "Location is Everything." We set bids to reach a specific position knowing that we needed to capture a high share of impressions on the most valuable keywords. Because, after all, those keywords were the key to reaching our target audience and we needed to make sure our target audience saw our ads.
Target CPA bidding takes a different approach. Think of it as a departure from "Location is Everything" to "Right Audience, Right Place, Right Time is Everything." Based on all of the data it has collected from its platform (average CPC, average cost/conversion, cost/conversion by position, expected CTR and expected conversion rate, among many other factors), Google makes a bid adjustment in the fraction of a second between a user hitting "enter" and their search results being served. Remember, it serves a different set of ads for each user as it matches the right advertiser to the right audience at the right time.
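"Right Audience, Right Place, Right Time" can be made concrete with a toy model. The signal names and multipliers below are entirely hypothetical, invented for illustration; the point is only that two users typing the same query can yield different predicted conversion rates, and therefore different bids.

```python
# Illustrative only: hypothetical signals and multipliers, not Google's model.
BASE_CVR = 0.03  # assumed baseline conversion rate for the keyword

SIGNAL_FACTORS = {
    ("device", "mobile"): 0.8,
    ("device", "desktop"): 1.1,
    ("hour", "evening"): 1.3,
    ("location", "NYC"): 1.2,
}

def predicted_cvr(signals):
    """Adjust the baseline conversion rate by each observed signal."""
    cvr = BASE_CVR
    for signal in signals:
        cvr *= SIGNAL_FACTORS.get(signal, 1.0)
    return cvr

def bid_for(target_cpa, signals):
    """Break-even bid for this auction: target CPA x predicted CVR."""
    return round(target_cpa * predicted_cvr(signals), 2)

# Two users, same query, same $20 CPA goal -> different bids:
print(bid_for(20, [("device", "mobile"), ("hour", "evening")]))   # -> 0.62
print(bid_for(20, [("device", "desktop"), ("location", "NYC")]))  # -> 0.79
```

A manual bidder can approximate this with device, location, and schedule bid adjustments, but only along a handful of dimensions at once; the auction-time model can combine many more.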
We set up a simple experiment testing manual CPC bidding (control) against a tCPA strategy (experiment). We ran the experiment across 20 campaigns from Q1 to Q2 2018 and were pleasantly surprised at the results. In this case, the client goal was to hit a $20 CPA:
On roughly the same number of impressions, Google Smart Bidding was not only effective in decreasing CPA (a 27.3% improvement over the control) but also drove a 30.3% higher conversion rate and delivered 30.7% more conversions.
One important thing to point out is that average position actually improved. This is a critical difference, because optimizing for CPA often results in a poorer average position (a result of lowering bids).
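If you run a similar control-vs-experiment test, the comparison math is straightforward. The figures below are hypothetical, chosen only to show the calculation; they are not the client's actual data.

```python
def cpa(cost, conversions):
    """Cost per acquisition."""
    return cost / conversions

def pct_change(new, old):
    """Percent change from control (old) to experiment (new)."""
    return (new - old) / old * 100

# Hypothetical campaign totals, not the client's actual data:
control = {"cost": 10000, "clicks": 8000, "conversions": 400}
experiment = {"cost": 9500, "clicks": 8000, "conversions": 523}

cpa_delta = pct_change(
    cpa(experiment["cost"], experiment["conversions"]),
    cpa(control["cost"], control["conversions"]),
)
print(round(cpa_delta, 1))  # -> -27.3 (a negative delta means CPA improved)
```

Comparing conversion rate (conversions/clicks) works the same way; just swap the metric function.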
Numerous other tests were conducted across multiple clients and campaigns with similar outcomes. In fact, Smart Bidding has managed to turn even the most skeptical search managers here into fans. That said, there are still instances where Smart Bidding won’t work. The most typical example is when a campaign has low conversion volume. As with any machine learning application, you have to be able to feed the beast. More data will yield better outcomes as it learns what works and what doesn’t work. We’re excited to see the evolution of Smart Bidding and will keep you posted as we test new features and techniques.