Ready, set, test! Creative testing is a fundamental part of any paid search campaign, so how can we approach testing, analyse results and drive the best possible outcome for a Client account? There are lots of ways to approach testing, and a great place to start is by getting organised and developing a structured plan. The plan should include a testing strategy, identify key outcomes with the Client, and use a testing matrix or plan to track, monitor and record the outcome of each test. Without a solid plan, testing often falls by the wayside, deprioritised in favour of more critical tasks. A plan or matrix helps keep testing on track so regular, effective analysis can take place, driving positive performance more consistently. The matrix or plan then becomes your manual for executing the tests in an organised manner.
This year has seen Google introduce the expanded text ad format for its AdWords platform, creating an opportunity to get started with a fresh round of creative ad testing. With expanded text ads rolled out at the end of January, now is definitely a great time to invest in a fresh round of creative testing and gain an understanding of how this ad format can yield great results for Client accounts.

Part 1: the groundwork

Outline the strategic approach

A good strategic approach will set you up for success. It helps to clarify what will be tested and to set goals or key outcomes for these tests with a Client. Think about aspects like how long tests will run, when results will be reviewed, and which devices and platforms tests will run across (performance is likely to vary across mobile, desktop, Google, Bing and the GDN, for example). Set expectations with Clients around the communication of results and timelines for the implementation of winning ads. Last but certainly not least, identify the statistical significance threshold you and the Client deem acceptable to measure your tests against; this may be 90%, 95% or 99%.
Furthermore, it’s pivotal to clearly outline the measure of success that relates directly back to your Client’s goals; this is likely to be clicks, conversions or revenue. By clearly outlining the Client’s goals in the testing strategy, you will know whether to review click-through rate alone (purely driving clicks) or in conjunction with either conversion rate or conversions per impression (conversions), or revenue per impression (revenue). Ad copy has the capability, to an extent, to act as an audience pre-qualifier, so any improvement made to CTR% where revenue or conversions also increase can offer hints that you are capturing the most valuable consumers in those micro-moments that matter. It’s important to note that conversions and revenue are secondary metrics in creative testing, as an ad would rarely have a statistically significant impact on conversion rate. If you are looking to improve conversion rate or revenue, you should be looking at getting started with conversion rate optimisation (CRO).
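To make the three candidate success measures concrete, the sketch below derives each one from the same four raw columns of an ad report. The figures are invented for illustration; only the formulas matter.

```python
# Derive the candidate success metrics from raw ad-report numbers.
# All figures below are made up for illustration only.
impressions = 120_000
clicks = 3_600
conversions = 180
revenue = 9_000.0  # in account currency

ctr = clicks / impressions                # click-through rate (clicks goal)
conversion_rate = conversions / clicks    # CR% (conversions goal)
conv_per_imp = conversions / impressions  # conversions per impression
rev_per_imp = revenue / impressions       # revenue per impression (revenue goal)

print(f"CTR: {ctr:.2%}")
print(f"CR: {conversion_rate:.2%}")
print(f"Conversions/impression: {conv_per_imp:.4f}")
print(f"Revenue/impression: {rev_per_imp:.4f}")
```

Whichever of these maps to the Client goal becomes the primary metric in the matrix; the others are recorded as secondary context.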

Develop a testing matrix or plan

An ad testing matrix stems from the strategic approach and can be developed in a few different ways. For example, identify the elements in an ad to test and then develop creative for each element, or generate the creative first and then fill in the matrix with the elements in each creative that differ from the control ad. Alternatively, look at the themes you are going to test, like mentioning brand vs not mentioning brand, price, free shipping and so on. Whichever way you decide to go, be sure the approach you take flows from the testing strategy and each test is focused on one element at a time (A/B testing). This will give you the best chance of having a clear winner and loser and maximise the positive performance you can squeeze out of each test.
Building out a testing matrix or table is really a means of organising all the elements in the ad you want to test, along with the different variations you wish to test for each element. An example of what the testing matrix might look like before being filled in is below:

Sample Testing Matrix:

[Image: sample testing matrix]

When developing new ad copy for testing, be sure to review the landing page and the keywords in the AdGroup to ensure the copy is highly relevant, and that quality score is not adversely affected by ads that are too broad. As with any new creative, follow your usual approval process for each Client and ensure you have the green light from the Client before you set any ad tests live.

Part 2: set up, review, implement

Set up the creative tests

It is tempting to set up all the tests in one go, but this has the potential to muddy the results and adversely impact the actionable outcomes from each test. Take the time to test one element at a time so you can work systematically through each test, analysing the results and implementing changes, thus maximising the positive performance that can be achieved across your Client’s account.
Campaign drafts and experiments in AdWords (as of October 2016 these tools replaced Campaign Experiments) help you stay on top of testing by creating a draft campaign that mirrors the selected live campaign’s settings, meaning changes only need to be made to the elements you want to test, in this case the ad creative. The draft campaign or AdGroup can also be reviewed before being set live. At this stage there are some features and reports that are not available for drafts; for detail on this, visit Google’s campaign drafts and experiments page in the help centre. Google provides detailed instructions on how to set these up on the ‘set up a campaign draft’ page.

Once the draft is set up, be sure to double-check the ad rotation setting is on ‘rotate indefinitely’, otherwise there is a possibility that the test ad may not even show during the test period (if ‘optimise for clicks’ or ‘optimise for conversions’ is selected). Once the draft set-up has been reviewed and approved (if applicable), it’s time to set it live by setting up a campaign experiment. To do this, follow Google’s detailed instructions on the ‘set up a campaign experiment’ page in the help centre. When naming the experiment in AdWords, be sure to add the dates the experiment will run (from/to) and the element you are testing (for example, the headline); this will allow you to quickly identify each experiment when it comes time to review the results. It is also recommended to set ‘no end date’ on the experiment, giving you the flexibility to run it for as long as required to collect enough data to determine whether there is a statistically significant difference between the control and challenger ads and so identify a winner.
If campaign drafts and experiments are not available, for example if you are looking to test ads on another platform like Bing or YouTube, set up the challenger ads for the first element to be tested from the matrix within the relevant AdGroup (remember to test only one element at a time). Make sure you label the challenger ads to make identifying them at the end of the test easier; try using either the ad name or label field to record the start date and the element being tested (if there is enough room in the label field).

Finally, identify a testing window based on the volume of data (clicks, impressions, conversions) the campaign is likely to receive. As a guide, you might use one month for high-volume AdGroups and two to three months for moderate- to low-volume AdGroups, monitoring how much data has accumulated and whether there is enough to review results or if the test still needs more time. Once you have determined the testing window, set up calendar reminders so you don’t forget about them! It’s a good idea to add a note in the matrix with the review dates as a backup; the last thing you want is the losing creative having a negative impact on your account, totally defeating the purpose of testing in the first place.
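One way to sanity-check the testing window is to estimate how many impressions each ad needs before a given CTR lift would even be detectable at your chosen confidence level. This is the standard two-proportion sample-size formula, not anything from AdWords itself, and the baseline CTR and hoped-for lift below are illustrative assumptions:

```python
import math
from statistics import NormalDist


def impressions_needed(baseline_ctr: float, relative_lift: float,
                       confidence: float = 0.95, power: float = 0.80) -> int:
    """Approximate impressions required per ad to detect a relative CTR
    lift between control and challenger (two-sided two-proportion z-test)."""
    p1 = baseline_ctr
    p2 = baseline_ctr * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p2 - p1) ** 2
    return math.ceil(n)


# e.g. a 2% baseline CTR and a hoped-for 15% relative lift, at 95% confidence
print(impressions_needed(0.02, 0.15))
```

Dividing the result by the AdGroup’s typical daily impressions gives a rough testing window; a larger expected lift or a lower confidence level shortens it considerably.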
Now with all the set up complete, it’s time to sit back and wait for the results to roll in.

Review the data

Let the number crunching begin. Be sure to have the strategy and matrix close by, as the goals (objectives) of the tests and ads are already organised there. As you review the data, it’s a good idea to place the results into the matrix excel sheet so that all the data is stored in one place, making it easy to track progress and be ready in case the Client or your manager asks you for an update. Depending on how and where you set up your tests, the data can be reviewed in one of two ways: either log into AdWords and navigate to the ‘all experiments’ tab, or download an ‘Ad report’ from the platform and export it into excel.
To review the data via the ‘all experiments’ tab, open the relevant experiment in AdWords and compare its performance against the original campaign; the interface shows the difference for each metric and flags where that difference is statistically significant, so winners can often be identified without leaving the platform.

Now, before there were AdWords drafts and experiments, paid search managers relied on excel sheets, tables and formulas to manipulate data, analyse results and identify statistical significance. This was especially true for testing creative themes (having a brand in the headline vs. no brand) more broadly across an account, as individual AdGroup data can be reviewed directly from the platform in most instances, with results entered into a statistical significance calculator (there are lots of these out there). In order to determine the creative winners, data was aggregated using pivot tables. To review your data this way, especially if you have run tests on Bing or YouTube or multiple AdWords experiments across different campaigns, keep reading.

First, download the ad data in an excel file (.xls or .csv is fine). Open the data file and copy and paste it into the same excel file as the creative matrix, and be sure to give the tab a logical name (for ease of navigation); in this example the tab was renamed ‘Ad Data’. Do a quick check to ensure that the data sheet lists the different sections of the ad (headlines, description lines and display URL) in separate columns, as this will follow the excel logic outlined below. Then, on the ‘Ad Data’ tab, insert two columns to the left of your data, and add the column headers ‘Call To Action’ and ‘Description line 1&2’ in A1 and B1 respectively. With cell A1 selected, insert a table for the data to sit within by selecting insert > table from the menu options. You should end up with something that looks similar to the below. Be sure to give the table a logical name too; in this example ‘AdDataSheet’ was used.

[Image: the ‘Ad Data’ tab set up as a table named ‘AdDataSheet’]

Now onto segmenting the data for review. This will depend on how the creative was set up across the account (broadly at the campaign level or more tailored at the AdGroup level) and the amount of data collected (for statistical significance testing). As an example, here we will look more broadly at creative testing across multiple campaigns, and from the example matrix we will review test 3, the call to action (headlines are more likely to be a whole line of text, whereas the call to action is usually a small section of the complete description, so slightly more complex). Your matrix should reflect something similar to the below example; note cells D8-10 contain the text we will use as a reference in later formulas to find each call to action in our data sheet quickly.

[Image: example testing matrix for the call to action test, with the three calls to action in cells D8-10]

In order to find the call to action, even if it is split across the two description lines, we first combine all the text in description lines one and two; hence column B. Now enter the below formula into cell B2 to concatenate the two lines of text into column B of the AdDataSheet table.

=CONCATENATE(AdDataSheet[@[Line 1]]," ",AdDataSheet[@[Line 2]])

The [Line 1] and [Line 2] references above are the column headers in the table for description lines 1 and 2. One important note: to ensure the lines read correctly and will be picked up by later formulas, a space has been added using ," ",. This means that if any of the call to action falls across the two lines, it will still be picked up in later formula matching.
Next we will search within the newly created complete description for the three calls to action listed in the matrix. To do this, use the formula below: paste it into A2, then copy it down the column.

=IFERROR(IFERROR(IFERROR(IF(SEARCH('Testing Matrix'!$D$8,'Ad Data'!B2)>0,'Testing Matrix'!$D$8,0),IF(SEARCH('Testing Matrix'!$D$9,'Ad Data'!B2)>0,'Testing Matrix'!$D$9,0)),IF(SEARCH('Testing Matrix'!$D$10,'Ad Data'!B2)>0,'Testing Matrix'!$D$10,0)),"Not In Test")
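For readers more comfortable outside excel, the same tagging logic can be sketched in Python: join the two description lines with a space so a call to action split across the lines is still matched, then label each ad with the first matching call to action, or ‘Not In Test’. The calls to action and description lines below are invented for illustration:

```python
# Hypothetical calls to action, standing in for matrix cells D8-D10.
CALLS_TO_ACTION = ["Buy Now", "Shop Today", "Free Shipping"]


def tag_call_to_action(line1: str, line2: str) -> str:
    """Mirror the excel formula: concatenate the two description lines
    with a space, then return the first call to action found in the
    combined text (case-insensitive, like SEARCH), else 'Not In Test'."""
    full_description = f"{line1} {line2}"
    for cta in CALLS_TO_ACTION:
        if cta.lower() in full_description.lower():
            return cta
    return "Not In Test"


# The CTA 'Buy Now' is split across the two lines but still matched.
print(tag_call_to_action("Great deals on shoes. Buy", "Now and save big."))
print(tag_call_to_action("Quality shoes for less.", "Order online."))
```

The space inserted between the lines plays exactly the role of the ," ", in the CONCATENATE formula above.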

Leaving the cursor in cell A1, select insert > pivot table from the menu bar; the table range will automatically list ‘AdDataSheet’ (or the name you gave your data table). Opt for the table to be placed on a new sheet (just remember to label the new sheet ‘pivot table’ once created, for ease of reference later). In the filters of the pivot table it is a good idea to have the campaign and AdGroup so that you can filter, if relevant, the campaigns you wish to review. In this example we are reviewing any ad that mentions the three calls to action listed in the matrix, so no filter is necessary, but they are there, just listed with ‘all’ selected. Drag and drop ‘call to action’ into the rows section, and the pivot table will look like the below example:

Example with filters:

[Image: pivot table with campaign and AdGroup filters and ‘call to action’ in the rows section]

Next we add in the columns for analysis following the matrix layout: clicks, click-through rate (CTR%), conversion rate (CR%) and revenue/impressions. Clicks can be selected from the available fields; drag and drop the field into the ‘values’ section as shown below:

[Image: clicks added to the pivot table’s ‘values’ section]

Then we need to create calculated fields to ensure the CTR%, CR% and rev/imps are calculated correctly and not as averages (which is what happens if you select the field directly instead of creating a calculated field). To do this, under ‘pivot table tools’ select fields, items & sets > calculated field.

[Image: the calculated field option under ‘pivot table tools’]

Then name the field. Remember there is already an existing field called CTR%, so you will need to add a space at the start or make the name slightly different from the existing field in order to create another one. Then insert the formula to create the field: for CTR% it’s = clicks / impressions, for CR% it’s = conversions / clicks, and for rev/imps it’s = revenue / impressions. Below is how the table will look now.

[Image: pivot table with the CTR%, CR% and rev/imps calculated fields added]
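It is worth seeing with numbers why a calculated field is used rather than dragging in an average of the row-level rates. A pivot CTR must be total clicks divided by total impressions; averaging each row’s CTR gives a different, wrong answer whenever rows carry unequal volume. A small illustration with made-up rows:

```python
# Made-up ad rows grouped under one call to action: (impressions, clicks)
rows = [
    (100_000, 1_000),  # high-volume ad, 1.0% CTR
    (1_000, 50),       # low-volume ad, 5.0% CTR
]

total_impressions = sum(imps for imps, _ in rows)
total_clicks = sum(clicks for _, clicks in rows)

# What a calculated field computes: sum of clicks / sum of impressions.
weighted_ctr = total_clicks / total_impressions

# What averaging the row-level CTRs computes instead.
naive_average = sum(clicks / imps for imps, clicks in rows) / len(rows)

print(f"Calculated-field CTR: {weighted_ctr:.2%}")  # 1.04%
print(f"Average of row CTRs:  {naive_average:.2%}")  # 3.00%
```

The tiny 5% CTR ad drags the naive average to nearly three times the true blended rate, which is exactly the distortion the calculated fields avoid.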

Remove the ads labelled ‘Not In Test’, as these ads do not contain the calls to action listed in the matrix, and you have a summary table.

[Image: summary table of results by call to action]

Once you have a summary table of the results, it’s time to check the statistical significance of the CTR% for each ad where enough data has been accumulated. There are a number of free online tools available; a few useful ones are by Delegator and Cardinal Path, or if you prefer to use excel, a great post by Chad Summerhill provides a downloadable spreadsheet you can try.
To use these online tools, simply download an ad report, create data tables using your matrix to identify the ads and the elements being analysed, and then paste the required data into the tools online. Once you have entered the data, you should be able to tell whether there is a statistically significant difference between the ads; update your matrix and compile the results.
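If you would rather not rely on an online calculator, the check these tools perform on CTR is a two-proportion z-test, which can be sketched in a few lines of Python. The click and impression counts below are illustrative:

```python
import math
from statistics import NormalDist


def ctr_significance(clicks_a: int, imps_a: int,
                     clicks_b: int, imps_b: int) -> float:
    """Two-sided p-value for the difference in CTR between two ads,
    using a pooled two-proportion z-test."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))


# Control: 400 clicks from 20,000 impressions; challenger: 480 from 20,000.
p_value = ctr_significance(400, 20_000, 480, 20_000)
print(f"p-value: {p_value:.4f}")
print("significant at 95%" if p_value < 0.05 else "not significant at 95%")
```

A p-value under 0.05 corresponds to the 95% threshold agreed in the strategy; swap in 0.10 or 0.01 for a 90% or 99% threshold.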

Implement the winning creative

Once all the hard work is done, the final and most important step is to implement the creative that won the test, or to pause the losing ads (ending the experiment in AdWords). Either way, make sure you complete this step; it will ensure the positive performance you managed to achieve benefits your Client account and there won’t be an ineffective ad dragging down performance.

Rinse and repeat

If you really want to drive the best results for a Client account, testing should definitely be an ongoing activity, as has been mentioned throughout this article. Once a test is complete and the winning creative(s) has been implemented, it’s time to start the process again by implementing the next challenger ad(s). To do this, revisit the testing matrix or plan, select the next ad(s) to test, and off you go.
Being organised and structured in your approach to ad testing will certainly help to drive positive performance over the lifetime of your search accounts and ROI for your Clients. Happy testing.

Key take outs:

Testing should be viewed as an always-on strategy for creative and performance improvement, occurring every 30-60 days depending on the amount of data the ads receive. At a minimum, the top 10 campaigns by traffic volume should have ongoing ad testing, with a review across all other campaigns as sufficient data is received. Google’s testing capabilities have improved in recent times with the introduction of campaign drafts and experiments. This has enabled businesses to run ad tests with multiple variations, gain valuable insights and view ongoing performance in the platform, saving the need to export data and manipulate it in excel. The platform also allows you to easily determine whether there is a statistically significant difference between ads and so identify a winner within AdWords itself.

Plan your tests, review results to find a statistically significant winning creative, implement results and repeat.

If you have any questions please do not hesitate to get in contact.


Stefanie Morrison

Business Growth & Operations Manager at Reprise Media Australia
I'm a Business Growth and Operations Manager here at Reprise. I have worked in digital marketing for over 8 years, across a range of verticals, from finance to retail. I am passionate about digital media and developing innovative strategies that help brands achieve business goals and deliver a seamless digital experience to their consumers.