Q: What if our creative team resists changing direction based on data?

Jim Miller:

Great question. Over the years, we’ve found many creatives initially worry pre-market testing will stifle their creativity. But it actually frees them up — testing lets them explore ideas without risking damage to the brand. It’s about experimenting in a safe space before going to market, which can spark more creative freedom, not less.

Q: Can you give a real-world example of combining virtual and experiential testing?

Shannon Anderson:

Sure. We’ve worked on direct mail projects where Jim’s team first ran virtual tests to refine messaging. Once we homed in on the best messages, we created multiple physical direct mail designs — very different from each other, but with the same info — and gave those to people to open and interact with.

Using eye tracking on the physical mail, we learned things like: If info was in a block on the side, people read it all; but if it was in bulleted lists, attention dropped off after the second bullet. The messaging was optimized first virtually, then validated and fine-tuned in real life.

Virtual testing is great for messaging, imagery and layouts, but it can’t test physical elements like paper type or card thickness. Experiential testing lets people touch and feel those differences. Using both together gives a full, reliable picture of what will work in market.

Q: You talked about pre-market testing for direct mail and in-store packaging, but is it applicable to digital campaigns as well?

Jim Miller:

Absolutely. The digital channel is definitely easier to test because you can create content and get it out there much faster than many other channels. But speed alone isn’t enough. If your content doesn’t resonate or is irrelevant, even fast testing won’t lead to success.

So yes, pre-market testing is absolutely applicable to digital. It’s less about speed and more about getting to relevant content for your audience faster and more efficiently.

Q: What are some of the most surprising consumer behavior patterns you’ve uncovered recently?

Shannon Anderson:

Oh, we have so many fun ones. One that stands out was a project we did several years ago with orange juice cartons. We tested four or five very different designs — and one was bright green. Now, think about the typical orange juice fridge and picture a green carton sitting in there. It got tons of attention — people spent a lot of time looking at it — but no one actually bought it.

It was fascinating because it went against what we usually see: grab attention, convert quickly. Here, people were intrigued but confused, and that led to rejection. It was a great lesson that standing out doesn’t always mean something works for the product or the category.

Q: How does Accelerated Marketing Insights compare to AI models?

Jim Miller:

AMI — our Accelerated Marketing Insights platform — is our pre-market testing solution. That’s a great question, because pre-market testing and AI serve very different purposes.

Pre-market testing doesn’t create content; it validates it. AI today is mostly about generating content. The best approach? Use AI to create and develop your content, then use pre-market testing like AMI to validate that creative before you invest dollars and time in market.

That validation step is critical. As we mentioned earlier, validated content can increase success rates sixfold, which has huge implications for budget and timing.

Shannon Anderson:

I’d add that while AI can analyze visuals — like telling you which beer label stands out on the shelf — it lacks the ability to understand shopper motivation. AI can’t tell you which product is the best seller or why people are drawn to it. That human motivation piece is still key in understanding real consumer behavior.