An Analytics Renaissance is Coming

AI Won’t Fix Your Analytics. Building a Decision Engine Will.

At the heart of the current shift in technology is a simple but powerful idea: what we can measure, AI can master. When you measure right and aim true, AI wins. Clarity is the unlock.

We see this in games like Go and chess, where models outperform humans because the objectives are clear, the feedback is instant, and the rules are defined. Business, of course, is not so tidy. It’s noisy, shifting, and deeply human. But even here, an era is emerging where AI won’t just assist with analysis; it will actively participate in the decision-making loop. The companies that rewire their analytics for this new reality will outlearn, outbuild, and ultimately outcompete the rest.

The Hidden Bottleneck

For years, companies have spent millions on data lakes and dashboards, and now they’re assuming this gives them a head start in the AI race. The reality is that most of that data was never designed for AI. It was built either to run applications or for humans to slice and dice in reports. This is a key reason why so many AI initiatives stall out: the foundation isn’t ready for them.

This has left most analytics teams stuck in a reactive cycle. A stakeholder asks a question, an analyst pulls data, and the answer goes into a deck. This loop is slow, brittle, and designed for human conversation, not machine action.

But that’s starting to change. A quiet revolution is underway, driven by AI-powered workflows. A new kind of analytics stack is emerging, one that turns static metric trees into living systems. In this world, dashboards aren’t endpoints; they are starting points. AI agents don’t just surface insights; they diagnose issues, generate hypotheses, and propose interventions. And over time, they will do more than propose. They will act.

The Unchanging Foundation: Why Core Analytics Still Matters

Even as AI becomes more capable, the fundamentals of good analytics are timeless. The basic purpose of data, to measure the business accurately, is still table stakes. The real shift is moving from data built for human reporting to data built to power automated action. The latter demands a whole new set of qualities. It all starts with the four basics of any reliable data system:

  1. Measure what matters, not just what’s easy. Don’t just count clicks. Measure what truly reflects progress toward your goals.
  2. Get the numbers right. Your data must be clean, your definitions unambiguous, and your team aligned on what each number means.
  3. Understand how things connect. Know how your metrics influence one another. If one goes up, what else should move with it, and why?
  4. Know which levers actually matter. Distinguish the actions that cause change from the surrounding noise. You can’t improve what you don’t understand.

These steps are the bedrock. But data built only for reporting can get away with being slow and reliant on a human to provide the final layer of context. Automated systems are far more demanding. To power intelligent agents, your data needs its business context encoded so a machine can understand it. It needs to be easily navigable so an agent can connect dots across domains. And often, it needs to be in real time, because an insight about an abandoned cart is useless five minutes later.

This is where building out your intellectual infrastructure becomes critical. This includes things like comprehensive metric trees, a clear set of guardrail metrics (the things you can’t break), and a registry of your core user segments. Think of it as creating a structured map of your business logic. This helps your team:

  • Align on what success looks like.
  • See how actions tie back to outcomes.
  • Spot problems with greater precision.
  • Give AI tools a clear, rich, and structured target to optimize.

In the past, analysts had to trace these connections by hand. Now, you can build a system where AI agents can move inside that logic because you’ve encoded the rules of the game.
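As a concrete sketch of what "encoding the rules of the game" can mean, here is a minimal machine-readable metric tree. Every metric name, owner, and relationship below is a hypothetical placeholder, not a prescribed schema:

```python
# A minimal, machine-readable metric tree: each node names the driver
# metrics that feed it, the guardrails that must not regress when it is
# optimized, and an owner. All names are hypothetical placeholders.
METRIC_TREE = {
    "weekly_active_users": {
        "drivers": ["new_signups", "d7_retention"],
        "guardrails": ["support_tickets_per_user"],
        "owner": "growth-team",
    },
    "d7_retention": {
        "drivers": ["onboarding_completion_rate"],
        "guardrails": ["uninstall_rate"],
        "owner": "activation-team",
    },
}

def drivers_of(metric: str, tree: dict = METRIC_TREE) -> list[str]:
    """Walk the tree to list every metric that feeds into `metric`."""
    direct = tree.get(metric, {}).get("drivers", [])
    nested = [d for m in direct for d in drivers_of(m, tree)]
    return direct + nested
```

With drivers and guardrails declared as data rather than tribal knowledge, an agent can answer "what feeds this metric?" or "what must not break?" without a human tracing the connections by hand.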

Rewiring the Loop: An Example

Let’s make this real. Imagine you’re on the data team at a fast-growing consumer tech company. You have dashboards and talented analysts, but insights take too long, and teams are asking you to “tell us what to do” and “help us be more strategic,” not just report “what happened.” A tale as old as time.

Now, imagine retention for a key Gen Z cohort suddenly starts dropping.

In most companies, a slow, manual process kicks off. Someone eventually notices the drop in a dashboard. Analysts scramble to segment the data. The team debates causes. An experiment might get designed weeks later. The whole cycle is inconsistent and relies on heroic effort.

But that process doesn’t have to be your ceiling. The alternative isn’t magic; it’s about deliberately designing your systems for AI consumption from the start. The individual tools like anomaly detection and root cause analysis aren’t new. What’s new is our ability to integrate them into a cohesive, high-speed workflow. The core idea is simple: the more of the process you can instrument, and the more context an AI can consume, the more of the execution it can own.

Here’s how that same scenario could play out, step-by-step:

1. A metric drops.

  • The Conventional Workflow: A human notices a dip in a weekly dashboard, days late.
  • Designing for AI Consumption: The metric tree is instrumented with automated anomaly detection. The goal is to pipe structured alerts into operational channels (like Slack) with clear ownership, creating a real-time signal a machine can intercept.
  • The Near Future: Agents will monitor these signals continuously, surfacing urgent issues with pre-built context on business impact.
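A rolling z-score check is one simple way to implement the kind of automated anomaly detection described above. The metric name, owner, and threshold here are assumptions, and a production system would likely use something more robust (seasonality-aware forecasting, for instance):

```python
from statistics import mean, stdev
from typing import Optional

def detect_anomaly(history: list[float], latest: float,
                   z_threshold: float = 3.0) -> Optional[dict]:
    """Flag `latest` if it deviates from the historical mean by more than
    `z_threshold` standard deviations. Returns a structured alert that a
    machine (or a Slack webhook) can consume, else None."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return None
    z = (latest - mu) / sigma
    if abs(z) < z_threshold:
        return None
    return {
        "metric": "d7_retention",    # hypothetical metric name
        "observed": latest,
        "expected": round(mu, 4),
        "z_score": round(z, 2),
        "owner": "activation-team",  # routes the alert to a clear owner
    }
```

The key design choice is that the output is a structured payload with an owner attached, not a pixel on a dashboard: that is what makes the signal something downstream automation can intercept and route.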

2. The issue is traced to a group: Gen Z iOS users from TikTok.

  • The Conventional Workflow: An analyst runs manual segmentation with inconsistent definitions.
  • Designing for AI Consumption: Knowledge is codified in a segmentation registry. This makes the business context of “who your users are” explicitly available for automated systems, translating human knowledge into a machine-readable format.
  • The Near Future: Agents will reference this registry to instantly isolate affected groups and route the issue.
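One lightweight way to build such a segmentation registry is to store each segment as a named, executable predicate, so every automated system resolves "who is this user?" identically. The segment definitions below are invented for illustration:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Segment:
    name: str
    matches: Callable[[dict], bool]  # predicate over a user-attributes dict

# Hypothetical registry: one canonical definition per segment.
SEGMENT_REGISTRY = [
    Segment("gen_z_ios_tiktok", lambda u: u["birth_year"] >= 1997
            and u["platform"] == "ios"
            and u["acquisition_channel"] == "tiktok"),
    Segment("ios_users", lambda u: u["platform"] == "ios"),
]

def segments_for(user: dict) -> list[str]:
    """Return every registered segment the user belongs to."""
    return [s.name for s in SEGMENT_REGISTRY if s.matches(user)]
```

Because the definition lives in one place, an analyst's query, an alerting job, and an AI agent all slice the affected cohort the same way, which is exactly what the conventional workflow's "inconsistent definitions" problem lacks.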

3. The root cause is found: a drop in quiz completion during onboarding.

  • The Conventional Workflow: Funnels are inconsistent or built ad-hoc.
  • Designing for AI Consumption: The team maintains clean, canonical funnel instrumentation. This makes critical user journeys permanently legible to machines, providing a stable map for an AI to analyze.
  • The Near Future: Agents will analyze drop-offs, correlate them to recent changes (like code deploys), and suggest likely causes.
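Canonical funnel instrumentation can be as simple as ordered step counts that any system analyzes the same way. A sketch, with hypothetical onboarding numbers:

```python
def funnel_dropoff(step_counts: dict[str, int]) -> list[tuple[str, float]]:
    """Given ordered step -> user counts for a canonical funnel, return
    each transition with its conversion rate, worst transition first."""
    steps = list(step_counts.items())
    transitions = []
    for (a, n_a), (b, n_b) in zip(steps, steps[1:]):
        rate = n_b / n_a if n_a else 0.0
        transitions.append((f"{a} -> {b}", round(rate, 3)))
    return sorted(transitions, key=lambda t: t[1])

# Hypothetical onboarding funnel for the affected cohort:
funnel = {"signup": 1000, "quiz_started": 900,
          "quiz_completed": 450, "first_session": 430}
```

Ranking transitions by conversion rate surfaces the quiz step immediately; the same stable map lets an agent diff today's rates against last week's and correlate the change with recent deploys.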

4. Experiment ideas are proposed.

  • The Conventional Workflow: Teams brainstorm from scratch, relying on scattered documents and memory.
  • Designing for AI Consumption: Experiment memory is centralized in a structured, navigable knowledge base. This turns past learnings from conversations into a persistent asset an AI can query.
  • The Near Future: Agents will search this repository to suggest interventions with the highest probability of success.
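A structured experiment memory can start as tagged records ranked by relevance to the current problem. The experiment names, tags, and lifts below are invented for illustration:

```python
# Hypothetical record of past experiments and the lift each produced.
EXPERIMENT_MEMORY = [
    {"name": "shorter_onboarding_quiz",
     "tags": {"onboarding", "quiz", "retention"}, "lift": 0.04},
    {"name": "welcome_push_notification",
     "tags": {"retention", "push"}, "lift": 0.01},
    {"name": "dark_mode", "tags": {"ui"}, "lift": 0.00},
]

def suggest_interventions(problem_tags: set[str], top_n: int = 2) -> list[str]:
    """Rank past experiments by tag overlap with the current problem,
    breaking ties by the lift they produced."""
    scored = sorted(
        EXPERIMENT_MEMORY,
        key=lambda e: (len(e["tags"] & problem_tags), e["lift"]),
        reverse=True,
    )
    return [e["name"] for e in scored[:top_n] if e["tags"] & problem_tags]
```

Even this naive tag-overlap ranking beats brainstorming from memory; a real system would likely layer semantic search on top, but the prerequisite is the same: learnings captured as structured, queryable records rather than buried in conversations.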

5. An experiment is designed and scoped.

  • The Conventional Workflow: A slow, manual back-and-forth of writing docs and tickets.
  • Designing for AI Consumption: Reusable templates embed best practices, creating a structured input that is easier for a system to parse and eventually generate.
  • The Near Future: Agents will draft design docs and tickets automatically, pulling in all relevant context.

6. The experiment is monitored.

  • The Conventional Workflow: Manual check-ins lead to tests that run too long or are misinterpreted.
  • Designing for AI Consumption: Automated monitoring platforms with pre-set thresholds turn monitoring into a deterministic, observable system.
  • The Near Future: Agents will monitor results in real time and summarize outcomes the moment a test reaches significance.
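As an illustration of "pre-set thresholds," here is a fixed-horizon two-proportion z-test. A real monitoring system would also need to correct for repeated peeking (for example with sequential testing), which this sketch deliberately omits:

```python
from math import sqrt, erf

def two_proportion_p(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates,
    using a pooled standard error and the standard normal CDF."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

def check_experiment(conv_a: int, n_a: int, conv_b: int, n_b: int,
                     alpha: float = 0.05) -> str:
    """Deterministic verdict an agent can post the moment data updates."""
    p = two_proportion_p(conv_a, n_a, conv_b, n_b)
    return "significant" if p < alpha else "keep running"
```

Because the threshold is fixed in code rather than debated in meetings, the test's stopping rule becomes observable and auditable, and an agent can summarize the outcome the moment the verdict flips.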

7. It works. Now what?

  • The Conventional Workflow: Learnings are lost, and the value of the work evaporates.
  • Designing for AI Consumption: A closed-loop system is in place. A winning experiment automatically updates metrics and roadmaps, putting the outcomes back into the operational flow.
  • The Near Future: Agents will detect the lift, link it to strategic goals, and recommend follow-up experiments automatically.

The Mindset Shift: From Dashboards to Decision Engines

The thread connecting these steps is a fundamental shift in mindset. Many of the best companies today have already moved beyond static dashboards. They have strong, insight-driven teams that operate in a tight, effective loop of analysis, decision, and action. But even this highly effective “human-in-the-loop” model has a ceiling; it relies on heroic effort and scales only as fast as you can hire and train brilliant people.

The next evolution is moving from a human-driven loop to a system-driven one. You are no longer just building assets to support your team’s decisions; you are architecting an engine that makes many of those decisions alongside them. By doing the hard foundational work of defining metrics, segments, and logic, you create a system where AI agents can operate effectively.

This is how you solve the context problem. AI isn’t the hard part. Encoding your business context so a machine can use it is. And if your company is still early in its analytics maturity, that’s not a weakness. It’s a strategic advantage. You can skip the debt of legacy systems and build a foundation for action from the start.

A New Division of Labor: Human Judgment, AI Acceleration

So where does this leave us? The goal isn’t to replace humans, but to eliminate the work that prevents them from being strategic. The work naturally splits into two clear camps.

AI excels at:

  • Monitoring metrics and routing issues
  • Diagnosing anomalies and suggesting causes
  • Generating hypotheses and drafting experiments
  • Writing and monitoring project tickets
  • Synthesizing test results and recommending next actions

Humans still lead on:

  • Defining the right objectives and strategic constraints
  • Interpreting ambiguous or conflicting signals
  • Designing novel strategies from first principles
  • Making complex tradeoffs between competing metrics
  • Building trust, alignment, and influence across the organization

The best analysts of the future will be those who combine taste, judgment, and systems thinking, and then use AI as a force multiplier to accelerate every turn of the loop.

From the Sidelines to the Frontier

For the past few years, I was in venture capital. From the outside, it’s a front-row seat to innovation. You meet brilliant founders and spend a lot of time thinking about and talking about the future. But you don’t build it yourself. Venture, for me, was too much a business of filtering ideas and persuading others to believe in far-fetched stories.

Insight doesn’t come from a thousand slide decks; it comes from a thousand decisions. The feedback loops in venture are long and noisy. You often don’t know if you were right for years. Somehow, 75% of the people you speak with think they’re going to be top-quartile performers.

Operating is different. It’s rigorous, fast, and relentlessly real. The results of your choices show up in hours, not quarters. You build systems, ship products, drive behavior, and watch what moves. And when things don’t move, you fix them. The true frontier is inside companies, in the chaos of real decisions and the daily grind of turning noise into action. That’s where the real learning happens. That’s where this new era is taking shape. And that’s where I want to be.

The Race for Clarity

The promise of AI in analytics is real, but its power won’t be unlocked by a new tool. It will be unlocked by the deliberate, foundational work of evolving your data from a tool for reporting into an engine for action.

The competitive landscape is being redrawn. For the last decade, the winners were the companies best at manually iterating through the insight-to-action loop. But the next winners won’t just be the ones with the sharpest analysts; they will be the ones that build systems to amplify that talent at scale. The race is no longer just for insight, but for the fastest, most intelligent decision engine. For data leaders, this is the moment to flip the script: to stop being a cost center blamed for failed projects and start being the strategic growth engine that makes AI a reality. The future belongs to those who build it with discipline, today.
