By Grant Shirk - February 25, 2021
Another year, another Gartner Magic Quadrant for Analytics and BI tools. The Leaders quadrant is again dominated by a few legacy platforms, focused as always on “nifty visualizations” and clean dashboards. But this year, the scoring took a novel turn.
“The real action in the BI market is around augmented analytics, or how well the BI tools incorporate machine learning and AI.”
That’s the takeaway from Alex Woodie, Managing Editor at Datanami. In his teardown of the quadrant, he states that “Augmented Analytics is the new BI Battleground.” That’s a bold statement, and one we agree with. In fact, we’ve been hard at work for years building the fastest augmented analytics platform in the industry. We know that augmented analytics is the future not just of BI, but of how organizations will make daily operational and strategic decisions.
The field of analytics and data-driven decision-making is at a crossroads. Long ago we crossed the point where the speed of compute and the cost of storage limited what we could do with data. Now, the bottleneck has shifted to the rate at which we can ask questions (and how quickly we can point and click through BI tools). To break the bottleneck and put this rich data to work, analytics teams need a new set of augmented analytics tools that can accelerate data exploration, focus our attention on the highest-priority populations, and finally bridge the gap between data and decision-makers.
BI tools have made it easy to visualize moderately complex data – a few variables at a time, aggregated daily, drilling in one or two levels deeper. But with the dimensions available to analytics teams today, there’s no way an analyst equipped with only a dashboard or SQL console can effectively identify the high-impact populations in a dataset amongst billions of possibilities.
Here’s why we think Gartner’s assessment doesn’t go far enough – and why data teams that adopt platforms purpose-built for augmenting the analysis of cloud-scale workloads will steer their companies into market-leading positions.
Augmented analytics: What it means for data teams
As our partners at Andreessen Horowitz illustrated, the overall data and analytics ecosystem has exploded in complexity over the past few years. Modern data infrastructures allow organizations to generate, govern, and put countless data sources to work. But this abundance of information comes at a cost to our ability to prioritize, assess, and understand the variables that really matter when making a decision.
In this world, traditional BI tools are holding us back. Built for simpler datasets and less frequent queries, these platforms are great for reporting and presentation, but when the questions come fast and furious, they can’t keep up with the demands of the business. Analytics teams need to augment their ability to explore complex data with speed and precision, and focus their attention on the highest-priority drivers.
To get ahead of this avalanche of data, analytics teams must get more proactive and more automated. This is why augmented analytics is on the rise, and why you see platforms like Tableau and Power BI dabbling in this area. But a casual, nice-to-have feature here and there won’t make the necessary impact. Data leaders need new analytics engines that are purpose-built to quickly point their teams in the right direction.
“Differentiation has shifted” to augmented analytics
The biggest sign that this approach is gaining momentum is the admission by Gartner that “differentiation has shifted to how well platforms support augmented analytics.” Whether it’s helping analysts build richer, higher-quality datasets, automating data exploration, or improving how they deliver recommendations, the focus is on helping data teams get beyond the dashboard to real-time recommendations when metrics change.
This matters because today’s data teams are tasked with diagnosing more metrics, tracking more variables, and making clear recommendations in real time. Take, for example, the Customer Experience Data team at Gusto. They are responsible for monitoring and diagnosing over 70 customer satisfaction and retention metrics, leveraging data gathered from 100,000 businesses. A single customer health score is built on hundreds of individual dimensions and hundreds of millions of combinations that could explain metric performance at any given time.
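To see how quickly those combinations multiply, here is a back-of-the-envelope sketch. The dimension counts and cardinalities below are illustrative assumptions, not Gusto’s actual schema; the point is only that a few hundred dimensions yield candidate subgroups in the billions.

```python
from math import comb

# Illustrative assumption, not Gusto's actual schema: a health score
# draws on 200 categorical dimensions, each with ~10 possible values
# on average, and an analyst might consider every subgroup defined by
# fixing values for up to 3 dimensions at once.
n_dims = 200
avg_values_per_dim = 10

total = 0
for k in range(1, 4):
    # choose which k dimensions to fix, times the value choices for each
    total += comb(n_dims, k) * avg_values_per_dim ** k

print(f"{total:,} candidate subgroups")  # roughly 1.3 billion
```

No analyst can point-and-click through a space that size; even checking one subgroup per second would take decades.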
SQL-based tools, notebooks, and visualizations aren’t enough to surface the factors that matter in this environment in time to act. The next generation of business intelligence and data tools will automatically and continuously explore these rich datasets, identify specific attributes and behaviors that impact metrics, and confidently inform action.
Cloud data requires a cloud-native infrastructure
One key reason these legacy platforms will struggle to actually deliver augmented analytics capabilities is architectural. These tools were not originally built for cloud architectures or cloud-native data. As a result, they need to make significant tradeoffs in power, speed, or coverage to compensate.
The size, velocity, and dimensionality of enterprise data require a completely new kind of analytics engine. It’s what we’ve built at Sisu after launching the original engine as part of the Stanford DAWN project.
From that research, we’ve developed a novel analytics engine that can quickly and comprehensively explore the most complex data, find the highest-impact factors, and continuously point users to the reasons why these metrics are changing. Designed for highly parallel processing, Sisu’s cloud-native architecture enables incredibly efficient query execution (with enterprise-grade security to boot). In addition, Sisu applies novel search algorithms that prioritize analysis of the most statistically promising results, allowing us to focus on the highest-impact populations in any dataset. Finally, Sisu uses a number of statistical signals and weakly-supervised user input to continuously improve the ranking and relevance of the facts we present.
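Sisu’s actual algorithms aren’t spelled out here, but the general idea of impact-ranked exploration can be sketched in a few lines. In this toy example (all data and attribute names invented for illustration), each candidate subgroup is scored by how much it drags a metric below the overall average, weighted by its size, and the subgroups are ranked so attention goes to the biggest driver first:

```python
from collections import defaultdict

# Toy records: each row has categorical attributes and a metric value.
rows = [
    {"plan": "pro",  "region": "us", "metric": 0.90},
    {"plan": "pro",  "region": "eu", "metric": 0.88},
    {"plan": "free", "region": "us", "metric": 0.40},
    {"plan": "free", "region": "eu", "metric": 0.86},
]

overall = sum(r["metric"] for r in rows) / len(rows)

# Gather metric values per single-attribute subgroup. A real engine
# would also explore multi-attribute combinations, pruning aggressively.
groups = defaultdict(list)
for r in rows:
    for attr in ("plan", "region"):
        groups[(attr, r[attr])].append(r["metric"])

# Score = (how far below the overall average) * (subgroup size),
# a crude stand-in for the statistical impact measures a real engine uses.
scores = {
    key: (overall - sum(vals) / len(vals)) * len(vals)
    for key, vals in groups.items()
}

ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])  # the subgroup most responsible for dragging the metric down
```

Here the engine’s job is to surface `("plan", "free")` as the top driver automatically, rather than waiting for an analyst to think of slicing by plan.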
In short, cloud-scale analytics are core to the DNA of Sisu.
On a collision course
While it is exciting to see the pioneers in the field of BI build features like Tableau’s Explain Data and Power BI’s “Automatic Insights,” which layer AI/ML-powered results on top of existing visualization engines, it’s not enough.
Based on our experiences in research and with customers, we think that fully extracting the potential in enterprise data can’t just be a new layer – it requires a complete architectural rewrite, with scalable and powerful AI/ML at the core of the product experience and visualization layered above. We believe the market is much like search — the results (provided by the search engine) will matter most, not the visualization (provided by the browser).
To get a quick preview of the quadrant and more of Alex Woodie’s annual breakdown, head on over to Datanami for the full article.
When you’re done, come on back and we’ll show you how Sisu’s fast, powerful augmented analytics platform can accelerate your data exploration and get you from data to decisions in seconds.