The Hierarchy of E-Commerce Insights

No one teaches e-commerce teams how to find insights that actually grow a business. Not in college. Not in your MBA. Definitely not in onboarding. You just get dropped into the role and told, “Find the insights that’ll grow revenue.” It sounds simple until you realize that nobody has ever defined what an “insight” really is, or how to tell the difference between one that actually changes customer behavior and one that just looks nice in a deck.

So, we fake it. We stare at dashboards and call them “insights.” We copy “best practices” we saw on LinkedIn and call them “insights.” We ask AI to generate a report, skim it for something that sounds smart, and call that an “insight” too. We’ve confused data with understanding. Because, in reality, a real insight isn’t a number; it’s the story behind the number. “Conversion dropped 12%” is not an insight. It’s a symptom. It tells you that something is happening, but not why it’s happening, what the customer is trying to do, or what got in their way.

Real insights take work. Messy, frustrating, slow work. You have to get obsessed, build a theory, dig through behavior data, talk to customers, and test your assumptions in the wild. Most of those theories will die. But the few that survive, the ones proven through experimentation, become the kind of insights you can confidently build on. Those are the insights that compound. They make you smarter over time.

The Pyramid: A Hierarchy of Insight Quality

If you’ve ever felt like your team is drowning in numbers but starving for direction, it’s probably because you’re operating at the wrong level of insight. I’ve found that most e-commerce teams are somewhere near the middle or bottom of what I call the “Hierarchy of E-Commerce Insights.” It’s a simple way to visualize the quality and reliability of different sources of insight, from pure opinion at the base to validated truth at the top.

At the very bottom, you’ve got the weakest forms of insight: heuristic analyses, AI summaries, and the famous “highest-paid person’s opinion.” These sources are cheap and fast, but they’re built on assumptions, not evidence. They can spark ideas, but they don’t tell you what’s true about your customers. They’re the brainstorming stage of insight generation, not the conclusion.

One level up, you reach what most teams consider “data-driven” territory: user behavior analysis (like click maps, session replays, and funnel drop-off reports) and business intelligence (dashboards full of KPIs). These tools are valuable because they reveal what’s happening, but they still stop short of explaining why. You can see that 70% of mobile users drop off at checkout, but not whether they were frustrated, distracted, skeptical, or confused. You can measure the symptom, but you can’t diagnose the cause.

Then, as you move higher up the pyramid, you enter the realm of fundamental understanding, where customer feedback, surveys, interviews, and user testing start to connect the dots between what people do and what they mean. This is where you start hearing the language of your customers: their anxieties, their motivations, and the gaps between what they expect and what your site actually delivers. It’s where data becomes empathy.

And finally, at the top, you reach the ultimate validation: the winning experiment. The experiment is the moment where your theory collides with reality. You take everything you’ve learned (the behaviors, the feedback, the hypotheses) and you test it in a way that reveals whether it holds up under pressure. A winning experiment isn’t just proof that something “worked”; it’s confirmation that you’ve discovered a truth about your customers’ decision-making. That’s the difference between guessing and knowing.

Why Most Teams Never Make It to the Top

The problem is that climbing this hierarchy takes time, coordination, and the willingness to be wrong. Many teams struggle to progress beyond the middle stage because they get stuck in cycles of reporting rather than learning. Weekly performance updates become rituals of observation rather than discovery. Everyone’s talking about conversion rate changes, bounce rates, and ROAS, but no one’s asking why. Dashboards are tidy, but the reality behind them is messy. Insights live in the mess, in support tickets, reviews, session recordings, and conversations that don’t fit neatly into a chart.

Even worse, incentives often work against the climb. Marketing teams want faster answers. Developers want precise requirements. Executives want slides that look like certainty. But the truth is, certainty only comes at the top of the pyramid, after you’ve done the hard, unglamorous work of combining research, behavioral data, and experimentation into a story that holds up.

How Mobile1st Uses the Hierarchy of Insights

Every engagement we undertake begins with an in-depth investigation that encompasses every level of the hierarchy. The goal is to uncover the kind of insight that almost guarantees winning experiments. I won’t get into the full details of our audit process here, but I’ll share how we use it once we’re working with a client.

For one of our long-term clients, a vehicle parts manufacturer we’ve partnered with for nearly two years, we recently used this process to uncover a significant win.

During our initial investigation, we found that approximately 35% of shoppers left product pages due to a lack of understanding about the installation process. When we dug into chat logs, we found a recurring pattern of questions like: “What else do I need to install this?” and “Does this come with everything required?”

Next, we layered in behavior data. Click maps showed that three of the top ten most-clicked items were accordion sections with detailed product information. That told us customers were actively hunting for clarity. Then we remembered a recent losing experiment: when we removed key product bullet points from the top of the page, conversion rates dropped noticeably. That failure reinforced the same theme: customers wanted more information, not less.

Finally, we evaluated heuristic patterns and design best practices for accordion layouts. The conventional wisdom was to close all accordion tabs by default, making it easier for shoppers to scan and find the section they wanted.

By combining all these signals (behavioral data, customer feedback, past test results, and design heuristics), it became clear what to test next. One of our guiding themes for this client was “make product information easier to find.”

So we set up a simple test:

  • Control: Product description tab open by default.

  • Variation A: Only the most popular tab (Product Includes) open by default.

  • Variation B: All tabs closed by default.

Closing all of the tabs turned out to be the winner. It increased conversion rate by 20% and revenue per visitor by 40%.
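If you want to sanity-check a result like this yourself, a two-proportion z-test is a common way to judge whether a conversion-rate lift is statistically significant rather than noise. The sketch below uses only Python's standard library; all visitor and conversion counts in it are hypothetical placeholders, not the client's actual numbers.

```python
# Hypothetical significance check for an A/B test like the one above.
# The traffic and conversion figures are illustrative, not real client data.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: control converts 500 of 10,000 visitors (5%);
# the variation converts 600 of 10,000 (6%, a 20% relative lift).
z, p = two_proportion_z(500, 10_000, 600, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With samples of that size, a 20% relative lift is comfortably significant; with much smaller samples, the same lift can easily be noise, which is why we let experiments run to adequate volume before calling a winner.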

When we reviewed session recordings of the experiment, the story became obvious. Shoppers were able to locate key product information faster, navigate between sections more easily, and discover complementary products they might need, driving both higher conversion and higher average order value.

This is precisely how the hierarchy works in practice. No single data source provided clear guidance. The win came from combining multiple layers of insight (qualitative feedback, behavioral data, design heuristics, and experimental validation) into one clear story about what customers were struggling with and how to fix it.

The Payoff: From Guessing to Knowing

Once you start operating at the top of the hierarchy, everything changes. Your team stops treating optimization as guesswork and starts treating it as an investigation. You stop testing random button colors and start testing theories about human behavior. You stop asking, “What’s wrong with the PDP?” and start asking, “What’s missing in the customer’s understanding that’s causing hesitation?” Those questions lead to changes that move metrics for the right reasons and continue to improve, because they’re built on truth.

Climbing the hierarchy won’t make your job easier, but it will make your results more predictable. It transforms e-commerce from a game of whack-a-mole into a system of learning. And once your team experiences that shift, from reacting to understanding, you can never go back.

Most directors of e-commerce don’t have time to do all of this hard work every day. That’s where Mobile1st comes in. We do this hard work to grow revenue per visitor so you can focus on getting everyone on your team on the same page.


Want more?

Make your job easier: let Mobile1st grow RPV for you

At Mobile1st, we help e-commerce brands grow revenue per visitor.
We do it by combining customer-first research with testing and experimentation that cuts through the noise of dashboards and opinions. Our team uncovers what really drives purchase decisions, then runs experiments to prove impact, so you can stop guessing and start scaling.

Want help? Reach out.

Our Latest Episode of Checkin to Checkout with Tom Funk

Want to be featured on Checkin to Checkout?

Send an email to justin@mobile1st.com, and we’ll set up a time to get to know you and see if you’re a great fit. We’re always looking for e-commerce leaders to feature on Checkin to Checkout.
