Made you look! Launch smarter with media that’s proven to resonate
How agile testing can meet the evolving landscape of consumer attention
May 30, 2025
Welcome
Path to Purchase Institute Moderator:
Hello everyone, and welcome to our webinar, Made You Look: Launch Smarter With Media That’s Proven to Resonate.
Let’s frame up today’s webinar.
Today’s presentation will explore the evolving landscape of consumer attention, the power of nonconscious decision-making, and the need for agile testing. Our speakers will showcase how brands can test the way customers shop and how they can use analytics and neuroscience-based technology to predict engagement.
Without further ado, our speakers today are Jim Miller, Vice President of Marketing Solutions at Accelerated Marketing Insights by Quad, and Shannon Anderson, Director of Client Research at Accelerated Marketing Insights by Quad.
All right, Jim, I’ll turn things over to you to get us started.
Jim Miller:
Great, thank you so much, Jackie.
Quad is a marketing experience company dedicated to helping brands improve performance across all marketing channels. The content we’ll be discussing today is grounded in Quad’s unique perspective and innovative techniques, developed through years of real-life work with hundreds of brands — all with the goal of helping those brands grow their business.
Much of what we’ll cover is centered on leveraging real-world behaviors through specific, pre-tested media measurement. These comprehensive testing methodologies can deliver powerful insights that often go beyond initial expectations and can be applied across digital, direct, packaging — really, across all media.
Whether you use virtual techniques such as trade-off analysis or eye tracking, or experiential techniques like physical prototyping or impulse buying, these practices all help predict live purchase behavior before incurring the time and expense of full market testing. Just as important, they increase the likelihood of success right out of the gate.
Shannon and I plan to cover a number of topics today, including:
- The universal challenge of how to obtain better results faster
- Thinking beyond just A/B testing to capture deeper insights
- Letting data uncover audience motivations
- Leveraging real-world behaviors to identify those insights
We’ll wrap up with a real-world case study where a client used a full, comprehensive strategy — including both virtual and experiential pre-market testing — to inform their go-to-market approach. And of course, we’ll leave time at the end for a Q&A session.
Jim Miller:
Let me pause and ask: How are you feeling as a consumer?
There’s so much happening right now in the marketplace — economic uncertainty, rising prices. It’s making many of us rethink how we spend our hard-earned money.
Like most people, the first thing I do every morning is pick up my phone. I’m immediately flooded with emails and ads from companies vying for my attention. Later in the day, I’ll walk my dog and check the mailbox — it’s filled with packages, postcards, flyers.
As a consumer, it can all feel a bit overwhelming. There’s just so much content coming into my home. And yet, I still welcome information about products and services that align with my wants and needs.
So it’s not the volume of media that bothers me — it’s that so much of it feels irrelevant.
Now, I want you to put your marketing hats back on. Let’s take a step back.
Our challenge — and our opportunity — is to understand what really matters to our audience. What kind of content do they want? What should the package look like? How should the copy read? What should we highlight in that messaging?
That’s the magic we’re all chasing. Because when we get it right — when we deliver content that resonates and engages — we know our performance metrics will improve.
That’s what we hope to provide today: A perspective on how to think differently about testing, no matter what media or mix you’re deploying. The goal is to help you move the needle on that universal challenge — how to get results faster.
Jim Miller:
Now let’s do a quick exercise.
What you’re seeing on the screen are eight postcards, shown front and back — top over bottom. The assignment with this client was to develop a concept that would outperform the current control in market. You can see the top-left card is the existing control. The goal was also to improve conversion among current customers who own a second vehicle, encouraging them to bring in that second vehicle for service as well as the first.
Each design you see met brand standards and featured the exact same offer. I want to give you a moment to take a look and, on your own, decide which — if any — might beat the control or perform best in market. I’ll give you about 10 seconds.
[Pause]
Hopefully, you’ve picked the one you think would win. Now let’s walk through the actual in-market results.
And the winner is: “Two cars, double the deals, too easy.”
Now look, everyone on this call is, to some degree, a pro. I’ve done this same exercise nearly 100 times, and on average, about 80% of participants guess incorrectly. Even the client couldn’t accurately predict the results.
The point is this: Professional experience alone consistently fails to identify the top performer. That’s why we need to let the data guide us. The data tells us what will resonate most with our intended audience.
So how do we find the right data? Especially when we’re talking about creative?
It’s not demographic, geographic or transactional data. We’re talking about something different — data that informs creative direction.
And the answer is pretty simple: We test.
Now, I’m confident most of you already have a robust testing strategy in place. I’m also confident that A/B testing is part of that mix — and it should be. It’s one of the most accurate ways to test, but it’s also one of the most expensive. Each test cell needs to run at a statistically significant volume, which drives up costs. And with channels like direct mail, where packaging is involved, those costs can really balloon.
Another hidden cost of A/B testing is time. Even if you’re getting incremental lift, it takes a long time to reach that next benchmark.
Our anecdotal research at Quad shows that A/B testing replaces performance benchmarks just 5% to 15% of the time — regardless of industry, product or channel. That’s not exactly inspiring.
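As a quick illustration of why those test cells get expensive, the required volume per cell follows standard two-proportion power arithmetic. Here is a minimal sketch in Python; the 1% baseline response rate and the 20% relative lift it targets are illustrative assumptions, not figures from the webinar.

```python
from statistics import NormalDist

def cell_size(p_control: float, p_test: float,
              alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-cell sample size for a two-proportion test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance threshold
    z_beta = z.inv_cdf(power)            # desired statistical power
    variance = p_control * (1 - p_control) + p_test * (1 - p_test)
    n = (z_alpha + z_beta) ** 2 * variance / (p_control - p_test) ** 2
    return int(n) + 1

# Illustrative direct mail scenario: 1.0% control response rate,
# trying to detect a 20% relative lift (1.2% response) per test cell.
print(cell_size(0.010, 0.012))  # roughly 42,700 pieces per cell
```

At direct mail print-and-postage costs, tens of thousands of pieces per cell is exactly where the budget balloons.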
So what’s the alternative?
Shannon and I are strong advocates for pre-market testing because we’ve seen it predict real-world behavior again and again. At Quad, we use both virtual and experiential pre-market testing — and the results are incredibly compelling.
Over the past seven years, when clients use these methods before going to market, we achieve successful, benchmark-replacing results about 85% of the time. That’s a 6x improvement over A/B testing.
Think about it this way: if your typical testing cycle takes three months, A/B testing alone could take 18 months to achieve the same breakthrough you might get from one round of pre-market testing.
That’s a year and a half of waiting — versus three months of insight and immediate gains.
To me, that’s pretty compelling.
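As a back-of-envelope check on that comparison, treat each round of testing as a geometric trial: if a round beats the benchmark with probability p, you wait 1/p rounds on average before a win. A minimal sketch using the success rates quoted above and Jim's three-month cycle:

```python
# Geometric model: the benchmark is beaten with probability p in any
# given round, so the expected number of rounds to the first win is 1/p.
def months_to_win(p_success: float, months_per_round: float = 3.0) -> float:
    return months_per_round / p_success

# Per-round success rates quoted in the webinar; three-month cycle is Jim's example.
scenarios = {
    "A/B testing, 5% per round": 0.05,
    "A/B testing, 15% per round": 0.15,
    "Pre-market testing, 85% per round": 0.85,
}
for label, p in scenarios.items():
    print(f"{label}: ~{months_to_win(p):.0f} months on average")
# 5% -> ~60 months, 15% -> ~20 months, 85% -> ~4 months
```

Jim's 18-month figure sits inside that A/B range (six rounds implies roughly a 17% per-round success rate), while the pre-market number lands near a single cycle.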
Shannon Anderson:
Thank you, Jim.
So how do we go beyond A/B testing? Because while it tells us what works better, it doesn’t tell us why.
That’s where experiential testing comes in. It helps us understand the “why behind the what.”
We set up in-context testing environments and create small experiments to watch real behaviors unfold.
Take this example we’re showing on screen. The concept is: your package is your product. Whether it’s your catalog, your mailer or your display ad, the way it looks directly represents your brand.
We ran a simple test with cereal shoppers. First, we asked participants to find their favorite cereal on the shelf. Pretty straightforward — they could do it easily based on the packaging.
Then we ran the exact same test, but this time we stripped away the packaging. We put the cereal in plain bags labeled only with the name — and asked them to shop again.
What we found was clear: shoppers weren’t looking for flakes — they were looking for boxes. The packaging is the product.
With tools like eye tracking and heat mapping, we could actually see how people processed the shelves, what grabbed their attention and how their decision-making changed.
This is the kind of insight we’re after: What really drives behavior?
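For anyone curious about the mechanics, a gaze heat map is essentially binned eye-tracking data. The sketch below uses simulated gaze coordinates in place of real tracker output to show the basic aggregation step:

```python
import numpy as np

# Hypothetical gaze samples: (x, y) coordinates in pixels, standing in
# for what an eye tracker records while a shopper scans a shelf image.
rng = np.random.default_rng(0)
gaze = rng.normal(loc=[640, 360], scale=[150, 80], size=(1500, 2))

# Bin the samples into a coarse grid; each cell's count is its "heat".
heat, _, _ = np.histogram2d(
    gaze[:, 0], gaze[:, 1],
    bins=[32, 18], range=[[0, 1280], [0, 720]],
)

# The hottest cell marks where attention concentrated on the shelf.
hot_x, hot_y = np.unravel_index(heat.argmax(), heat.shape)
print(f"Peak attention near grid cell ({hot_x}, {hot_y})")
```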
And like Jim mentioned, one of the most common objections we hear when we talk about this kind of testing is, “I’d love to do this — I just don’t have time.”
But if pre-market testing gets you to your goal six times faster, the time argument starts to fall apart. You could be generating incremental revenue gains right now instead of waiting a year and a half for A/B testing to deliver a breakthrough.
Shannon Anderson:
The reality is, if you don’t have time to test before going to market, you definitely don’t have time to fail once you’re there. You launch something based on gut instinct, and if it underperforms, you’re right back at square one — spending more time, more budget and more resources to fix it.
But taking even just a month up front to test can completely change the game. It gives you confidence and clarity. Like Jim said, let data do the heavy lifting. That’s what we focus on: using neuroscience-backed testing that blends traditional research — like surveys and interviews — with in-context consumer behavior.
We look for the intersection between what people say, what they do and what they think. That’s the sweet spot.
We use Tobii 3 eye-tracking glasses, which record 50 gaze samples per second. So if someone shops a planogram for 30 seconds, that's 1,500 data points — per person. It means we can quickly build a statistically valid dataset. We can test 30 people in half a day, run a round or two and usually have results in about two weeks.
So that whole “we don’t have time” thing? It kind of falls apart.
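The dataset arithmetic behind that claim is easy to verify:

```python
sample_rate_hz = 50   # gaze samples per second, as quoted for the glasses
shop_seconds = 30     # one shopper's time on the planogram
participants = 30     # tested in half a day

per_person = sample_rate_hz * shop_seconds   # 1,500 gaze points per shopper
per_round = per_person * participants        # 45,000 gaze points in half a day
print(per_person, per_round)
```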
Shannon Anderson:
We also find answers to questions we didn’t even know to ask. A couple of years ago, a client was testing a shift from plastic to paper packaging. While they were at it, they tested both a natural kraft paper bag and a bleached white version.
The kraft bag not only grabbed more attention, but in post-shop surveys, 40% more people said they’d be likely to recycle it. So it wasn’t just visually more appealing — it actually improved the brand’s sustainability perception.
That kind of insight is gold. You’re not just choosing a bag — you’re tapping into consumer values and buying behavior.
And like we’ve said, your media is your product. That goes for packaging, catalogs and direct mail, too.
We did a study earlier this year that brought parents and kids into the lab together to test a toy catalog. We eye-tracked both groups because we wanted to see how they look at things differently.
Parents told us price was the biggest factor. But what we saw was that kids looked at the price before the parents did. It was the fifth thing they noticed, versus the ninth thing for parents. Why? Because kids have to sell their parents on the item.
And here’s the kicker: parents who shopped with their kids spent $87 more. So maybe leaving the kids at home doesn’t actually save you money after all.
We even discovered that kids tend to look at the right-hand page first. So if you really want to grab their attention? That’s where your top product should go.
This kind of testing moves us beyond “Is this better or is this better?” It gives us the why behind the behavior.
Jim Miller:
Those are great examples, Shannon — thank you. I want to take a moment to talk about the virtual side of pre-market testing and how it compares to the physical side.
Our virtual testing platform is rooted in trade-off analysis. For anyone unfamiliar, it’s a statistical method that evaluates different creative elements — like design, messaging and layout — to determine the best-performing combination.
Jim Miller:
Basically, trade-off analysis shows us how real consumers are likely to behave when presented with specific marketing content. It uncovers subconscious decision-making — giving us insights that aren’t just accurate, but competitively actionable.
And again, this happens faster and more affordably than traditional A/B testing.
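To show the bones of the method, here is a minimal sketch of trade-off-style analysis: dummy-code each creative element, fit a simple utility model to observed choices, and read off each element's part-worth. The attribute names and pick rates below are invented purely for illustration.

```python
import numpy as np

# Invented example: each concept combines one design, one message and one
# layout, dummy-coded as 0/1 indicator columns (a full factorial design).
#              designB  msg_savings  layout_grid
concepts = np.array([
    [0, 0, 0],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
])
# Observed share of respondents who picked each concept in a choice task.
chose = np.array([0.10, 0.18, 0.22, 0.12, 0.31, 0.20, 0.25, 0.34])

# Least-squares fit of a linear utility model: pick rate ~ intercept + part-worths.
X = np.hstack([np.ones((len(concepts), 1)), concepts])
coef, *_ = np.linalg.lstsq(X, chose, rcond=None)
intercept, *part_worths = coef

names = ["designB", "msg_savings", "layout_grid"]
print("part-worths:", dict(zip(names, part_worths)))

# Predicted best combination: include every element with a positive part-worth.
best = [name for name, w in zip(names, part_worths) if w > 0]
print("predicted winning combination:", best)
```

Production tools use proper choice models and far larger designs, but the principle is the same: estimate the contribution of each element, then assemble the highest-utility combination, even one that was never shown as a complete concept.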
Here’s a real-world example: Back in 2000, Porsche faced serious financial challenges. They had to broaden their appeal beyond sports car enthusiasts without alienating that core audience.
So they turned to trade-off analysis. The result? The Porsche Cayenne SUV. It became their top-selling vehicle and arguably saved the company — without losing the loyalty of their base.
And to bring it closer to home, we worked with a Fortune 100 insurance company that had been rotating three control packages in the mail to avoid fatigue. They were seasoned pros — testing often, sophisticated in approach — but none of the controls had been beaten in years.
They came to us to see if virtual pre-market testing could help them uncover a new control. The catch? No new content. No fresh copy. No new imagery. The challenge was purely structural.
So basically, we took the three existing packages and mixed and matched their content any way we wanted — but couldn’t add any new material. The goal was to see if we could uncover a fourth package that would perform as well as the current three. Then, they wanted to add that fourth option into the rotation to extend the life of the whole campaign.
After running our pre-test study, we recommended a new control package using only their existing copy, imagery and offers. We projected about a 17.3% lift in market performance. The client was skeptical, but they followed through — and the results came back with a 19% lift over all previous versions. They were ecstatic.
Ultimately, they replaced all previous controls with this pre-market tested version and kept building from there. This shows that when you look beyond simple A/B testing and embrace broader pre-test methods, you can achieve better results faster.
Shannon Anderson:
One big concept we highlight in shopper testing is impulse buying — what marketing folks know as System 1 thinking versus System 2 thinking. It's all about capturing attention fast, because if they don't see it, they won't buy it.
For example, we worked with a health and beauty brand testing new designs on shelves. Initially, their products were shelved near the bottom. When we tested moving those products to the next-to-top shelf, every design performed twice as well. This insight gave the brand real data to persuade the retailer for better shelf placement — and better sales for everyone.
Eye tracking and nonconscious decision data made that possible. That's the power of combining testing methods into a comprehensive strategy.
We also worked recently with GreenBlue, the folks behind the How2Recycle label, helping them redesign and optimize the label using multivariate studies and in-store experiential tests.
The goal? Make recycling info clearer and more effective for consumers both in the store and at home. This ongoing partnership combines large-scale surveys with real-world shopper behavior to drive sustainability forward.
Now, let’s open it up for questions.
Q: What if our creative team resists changing direction based on data?
Jim Miller:
Great question. Over the years, we’ve found many creatives initially worry pre-market testing will stifle their creativity. But it actually frees them up — testing lets them explore ideas without risking damage to the brand. It’s about experimenting in a safe space before going to market, which can spark more creative freedom, not less.
Q: Can you give a real-world example of combining virtual and experiential testing?
Shannon Anderson:
Sure. We've worked on direct mail projects where Jim's team first ran virtual tests to refine messaging. Once we homed in on the best messages, we created multiple physical direct mail designs — very different from each other, but with the same info — and gave those to people to open and interact with.
Using eye tracking on the physical mail, we learned things like: If info was in a block on the side, people read it all; but if it was in bulleted lists, attention dropped off after the second bullet. The messaging was optimized first virtually, then validated and fine-tuned in real life.
Virtual testing is great for messaging, imagery and layouts, but it can’t test physical elements like paper type or card thickness. Experiential testing lets people touch and feel those differences. Using both together gives a full, reliable picture of what will work in market.
Q: You talked about pre-market testing for direct mail and in-store packaging, but is it applicable to digital campaigns as well?
Jim Miller:
Absolutely. The digital channel is definitely easier to test because you can create content and get it out there much faster than many other channels. But speed alone isn’t enough. If your content doesn’t resonate or is irrelevant, even fast testing won’t lead to success.
So yes, pre-market testing is absolutely applicable to digital. The value isn't the channel's raw speed; it's getting to content that's relevant to your audience faster and more efficiently.
Q: What are some of the most surprising consumer behavior patterns you’ve uncovered recently?
Shannon Anderson:
Oh, we have so many fun ones. One that stands out was a project we did several years ago with orange juice cartons. We tested four or five very different designs — and one was bright green. Now, think about the typical orange juice fridge and picture a green carton sitting in there. It got tons of attention — people spent a lot of time looking at it — but no one actually bought it.
It was fascinating because it went against what we usually see: grab attention, convert quickly. Here, people were intrigued but confused, and that led to rejection. It was a great lesson in how something that stands out doesn’t always mean it works for the product or the category.
Q: How does Accelerated Marketing Insights compare to AI models?
Jim Miller:
AMI — our Accelerated Marketing Insights platform — is our pre-market testing solution. Comparing pre-market testing and AI is a great question, and they serve very different purposes.
Pre-market testing doesn’t create content; it validates it. AI today is mostly about generating content. The best approach? Use AI to create and develop your content, then use pre-market testing like AMI to validate that creative before you invest dollars and time in market.
That validation step is critical. As we mentioned earlier, validated content can increase success rates sixfold, which has huge implications for budget and timing.
Shannon Anderson:
I’d add that while AI can analyze visuals — like telling you which beer label stands out on the shelf — it lacks the ability to understand shopper motivation. AI can’t tell you which product is the best seller or why people are drawn to it. That human motivation piece is still key in understanding real consumer behavior.