Thursday, April 23, 2015

Experimental Design and AdWords

As my team's AdWords campaign has come to a close, I would like to reflect on the campaign in the scope of our experimentation. As part of the Google Online Marketing Challenge, one stipulation of the competition was that we had to alter at least one aspect of our campaign within the 17 days the ads ran. We altered many parts of our campaign, including adding new keywords, changing the titles of our ad extensions, and increasing our budget (due to the shortening of our campaign time). The action we took that most relates to experimentation, however, was the addition of a third ad.

Our original ad group involved Ad #1 and Ad #2, which had identical titles and destination URLs. There was only one difference between the ads: Ad #1 had a call-to-action in the second line, and Ad #2 had a unique value proposition. After the first few days of monitoring the AdWords account, it became apparent that Ad #1 was performing much better, at 98% served, than Ad #2, at 2% served. Our group then decided to add Ad #3 to the group, which kept the same title and destination URL but combined the first description line from Ad #2 with the call-to-action from Ad #1 as its second description line. This ad also quickly surpassed Ad #2 in percentage served, and was on trend to be as successful as Ad #1 (I do not have the metrics at this time, unfortunately).

This action relates to the TED-Ed video and two articles given in class, "Finally, A Majority of Executives Embrace Experimentation" and "How to Design Smart Business Experiments." The three resources all shared an underlying theme: the idea that too many businesses add new features based on "gut feelings," and should instead adopt a "test and learn" mindset. While there are many experimentation methods out there, like prototyping, simulation, randomized clinical studies, and epidemiological studies, the one that related most to our digital marketing objectives was test groups and A/B testing. The actions taken in our ad campaign represent an A/B test because we ran two live versions of the same ad at once and let the higher-performing version guide our next decision.
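To make the A/B comparison concrete, here is a minimal sketch of how one could check whether one ad variant truly outperforms another, using a standard two-proportion z-test on click-through rates. The numbers below are hypothetical illustrations, not our campaign's actual metrics.

```python
import math

def two_proportion_z(clicks_a, impressions_a, clicks_b, impressions_b):
    """Z-statistic for the difference between two click-through rates."""
    rate_a = clicks_a / impressions_a
    rate_b = clicks_b / impressions_b
    # Pooled rate under the null hypothesis that both ads perform equally
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    std_err = math.sqrt(
        pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b)
    )
    return (rate_a - rate_b) / std_err

# Hypothetical figures: Ad #1 gets 120 clicks on 2,000 impressions,
# Ad #2 gets 80 clicks on 2,000 impressions.
z = two_proportion_z(120, 2000, 80, 2000)
print(round(z, 2))  # a |z| above about 1.96 suggests a real difference
```

With these made-up numbers the z-statistic comes out near 2.9, which would suggest the difference in click-through rate is unlikely to be chance alone, and that is exactly the kind of evidence a "test and learn" approach asks for before dropping the weaker ad.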

In the future, I would like to use this type of experimentation on other types of digital media. I find this kind of live testing, in which you can immediately gather data to determine what is performing best, very interesting. In a technologically driven world where consumers expect marketers to understand exactly what they want, this kind of experimenting delivers immediate results driven by the very customers you are targeting.
