Why Has No One Cracked UA Creative Testing?
The sun has set on San Francisco's good old days, when you would hear a gunshot every hour and half-naked crack addicts chased you down the street.
I sat in an office at a large game company, which we will not name.
The creative team presented a massive report: graphs, charts, tables, and text the length of the New Testament, or the Bible, whichever is longer.
"This is our report and based on all the signals we got from our testing in New Zealand, this creative concept is going to perform much better with the target market players"
They were moving from pre-launch to global launch and preparing the UA creatives for it. All signs from testing showed that one creative concept performed far better than the rest.
"Killer!" the VP of UA said, and asked the team to launch the campaigns with this concept.
A few months later, that game was sunset. The creative concept flopped, and the team couldn't acquire users at a CPI that made sense: it came in 2x higher than expected. With no path to a workable ROAS, they couldn't keep funding user acquisition, and the game quickly died out.
When I chatted with the creative director some time later, I asked what she thought about the failed launch and why the results were so different from testing.
She told me: "Oh, we don't test to actually find the winner. Who knows what will work. We tested to not get fired with the rest of the team. We had data that supported our decisions."
The truth about testing creatives
It doesn't truly work anymore, at least not consistently. Who knows how well it ever worked, and how much of it was testing theater: performing a show that is more about politics than actual performance.
I'm not talking about all forms of testing. I'm talking about taking your UA creatives and testing them somewhere "safe," on the assumption that the audience in that country correlates with your target tier-1 markets.
In this era of user acquisition, you don't know who is actually seeing your ads. Not to the extent that you can control that variable and run a valid test.
Which means, and I don't need to convince anyone of this, that the winning creative concept in an A/B test in some country says almost nothing about what will win in your actual BAU (business as usual) main campaigns.
We're not in the business of figuring out the most valid testing methodology. We're in the business of winning.
Leave testing theater to other teams
So if you're interested in winning, in hitting your spend and ROAS goals, we hope you don't really care about testing methodologies unless they can be proven to be an indicator of performance.
The risks of doing any form of sandbox, or almost-sandbox, testing are massive.
It leads teams to throw promising concepts out the door.
The more alarming reality is that you are missing out on concepts that didn't win your tests but would have become top share-of-voice creatives in your BAU campaigns.
The environment has changed, and with it where creative testing should happen
The ad networks themselves have changed as well. SDK networks like AppLovin are sophisticated machines perfecting the science of matching valuable audiences to creatives.
To achieve this, massive testing happens behind the scenes: continuous sampling of different audiences across games and genres, and reinforcement learning that steers each creative toward the placements where the players most likely to install and pay are.
This is not a machine that needs teams to add another testing layer on top of it and feed it only "tested creatives." The networks' machines are like reflectors: they show you whether a creative is resonating by looking at real performance in the environment where the creatives will actually run.
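To make the "continuous sampling plus reinforcement learning" idea concrete, here is a minimal sketch of that kind of explore/exploit allocation as a Thompson-sampling bandit that routes impressions toward the creative most likely to convert. Everything here is illustrative: the class names, install rates, and impression counts are hypothetical, not any network's actual system.

```python
import random

class CreativeArm:
    """One creative concept, with a Beta posterior over its install rate."""
    def __init__(self, name):
        self.name = name
        self.installs = 0      # observed installs (successes)
        self.no_installs = 0   # observed impressions without install (failures)

    def sample(self):
        # Draw a plausible install rate from the Beta(installs+1, no_installs+1) posterior.
        return random.betavariate(self.installs + 1, self.no_installs + 1)

def serve_impressions(arms, true_rates, n=50_000):
    """Simulate n impressions; each goes to the arm with the highest
    posterior sample, and real performance is fed back into that arm."""
    for _ in range(n):
        arm = max(arms, key=lambda a: a.sample())
        if random.random() < true_rates[arm.name]:
            arm.installs += 1
        else:
            arm.no_installs += 1
    return arms

random.seed(7)
arms = [CreativeArm("concept_a"), CreativeArm("concept_b")]
# Hypothetical ground-truth install rates the bandit has to discover.
serve_impressions(arms, {"concept_a": 0.02, "concept_b": 0.035})
for a in arms:
    shown = a.installs + a.no_installs
    print(a.name, shown, round(a.installs / shown, 4))
```

The point of the sketch: the allocator converges on the stronger creative by itself, using live traffic, which is exactly why a separate sandbox layer adds little on top of it.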
So instead of wasting precious time and risking killing creative concepts on false signals, we suggest building a process that tests creatives in real-world environments: your BAU campaigns.
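One way to read that reflection from BAU campaigns, rather than a sandbox, is to compare each creative's observed install rate with a confidence interval before declaring a winner or a loser. The sketch below uses a 95% Wilson score interval; the creative names and numbers are hypothetical.

```python
import math

def wilson_interval(installs, impressions, z=1.96):
    """95% Wilson score confidence interval for an install rate."""
    if impressions == 0:
        return (0.0, 1.0)
    p = installs / impressions
    denom = 1 + z**2 / impressions
    centre = (p + z**2 / (2 * impressions)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / impressions + z**2 / (4 * impressions**2)
    )
    return (centre - margin, centre + margin)

# Hypothetical BAU results: installs and impressions per creative concept.
bau_results = {
    "concept_a": (210, 10_000),
    "concept_b": (340, 10_000),
}
for name, (installs, impressions) in bau_results.items():
    lo, hi = wilson_interval(installs, impressions)
    print(f"{name}: rate {installs / impressions:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```

If the intervals do not overlap, the BAU data itself is telling you which concept resonates; if they do overlap, the honest answer is "keep both live," not "kill the one that lost a sandbox test."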