Three Ways to Increase Analyst Efficiency and Decrease Data Prep

By Brynne Henn - May 21, 2020

Conventional wisdom holds that data teams spend more than 80% of their time tackling data prep and only 20% doing valuable analysis. That’s an immense hurdle for even the most effective analytics teams to overcome. At any scale, this imbalance in focus puts a severe strain on analyst efficiency as the pressure to answer business questions increases over time.

In its cover article, “Reversing the 80/20 Ratio in Data Analytics,” Database Trends and Applications quoted Ben Newton, Director of Operations at Sumo Logic, who observed: “Just 5 or 6 years ago, innovative companies were satisfied with one- or even multiple-day delays for insights from their data. That is no longer the case. Many companies have only hours, or even minutes, to respond to user behavior and market trends. The companies that are winning are basing much of their competitive muscle on the ability to leverage their data effectively and quickly.”

A growing business cannot afford to have its analysts spending 4 out of every 5 days on the rote work of data preparation – it’s unsustainable and forces businesses to rely on gut feel and assumptions to make decisions in the moment.

If you’re feeling stretched thin by similar pressures on your team, there are three things we recommend that can both increase the amount of time you have to devote to creative analysis and make that newfound time more effective.

Use the data you already have

“You can’t afford to spend hours cleaning up data, nor can you waste time worrying whether your data is good enough.” – Peter Bailis, Sisu CEO 

Long gone are the “good times” when you could afford to wait for better, cleaner, more normalized data. Research shows that the pace of decision-making in most organizations is rapidly increasing, and the difference between making a poorly informed “gut call” and a well-informed decision can come down to only a few minutes. Fortunately, most businesses are already flush with rich transactional data, and every record they collect today carries more context, and more features, than ever before. The challenge is putting that data to work without arduous data prep.

Unfortunately, we’re all trained on tools that expect more straightforward data, and we’re wasting time simplifying data and often hiding valuable factors in the process. It’s a big drag on analyst efficiency overall. 

“Organizations seek perfection too often in their drive to have the cleanest data possible, down to the transaction level, even if the data is only being used for strategic purposes such as trend analysis. Organizations need to be more efficient and prepare their data to the level that supports the detail of analysis they need to do.” – Glen Rabie, CEO of Yellowfin

To get answers from your data faster, you have to break free of these two fallacies: that data must be perfectly clean before analysis can begin, and that it must be simplified to fit your tools. With proactive, ML-powered analytics platforms like Sisu, you no longer need to labor over feature selection and “perfectly clean data.” Moreover, you can cover far more ground faster than doing the slicing and dicing yourself. Let us find the underlying variables driving change in your metrics, and spend more of your time putting the results into context for the business.

Automate exploration (aka leave the tedious stuff to AI) 

Once you’ve made the shift to working with flat, wide tables, it’s unrealistic to expect anyone to manually review the hundreds of thousands of possible hypotheses in the data. This problem is where a cloud-native analytics engine really shows its value. Let Sisu handle the repetitive, rote work of exploring the data and showing you where to focus your attention. In turn, you’ll see a huge increase in analyst efficiency. In a recent report, Ventana Research argues,

“To evaluate such large volumes of information requires new types of analytical techniques based on artificial intelligence and machine learning algorithms. Evaluating the myriad of combinations can be complex and time consuming. Using AI/ML can speed up the process of diagnosing the root causes of observed changes and determining what actions to take in response.” – David Menninger, Ventana Research

This kind of large-scale, high-speed automation frees you up to do the creative work of interpreting the facts, diving deeper into interesting populations, and really telling the story of what’s going on.
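To make the scale of that automation concrete: Sisu’s actual engine is proprietary, but the core idea of scanning many hypotheses at once can be illustrated with a toy, purely hypothetical sketch. The code below brute-forces every (feature, value) subgroup in a tiny dataset and ranks them by how far each subgroup’s average metric deviates from the overall average, weighted by subgroup size. The data, the `subgroup_impacts` function, and the scoring rule are all invented for illustration.

```python
# Toy sketch of automated subgroup exploration (illustrative only; this is
# not Sisu's algorithm). Score every (feature, value) subgroup by how much
# its average metric deviates from the overall average, weighted by size,
# then rank the biggest movers.

rows = [
    {"region": "NA", "plan": "pro",  "churned": 0},
    {"region": "NA", "plan": "free", "churned": 1},
    {"region": "EU", "plan": "pro",  "churned": 0},
    {"region": "EU", "plan": "free", "churned": 1},
    {"region": "EU", "plan": "free", "churned": 1},
    {"region": "NA", "plan": "pro",  "churned": 0},
]

def subgroup_impacts(rows, metric, features):
    overall = sum(r[metric] for r in rows) / len(rows)
    impacts = []
    for feat in features:
        for val in {r[feat] for r in rows}:
            group = [r for r in rows if r[feat] == val]
            avg = sum(r[metric] for r in group) / len(group)
            # size-weighted deviation from the overall average
            impacts.append(((feat, val), (avg - overall) * len(group)))
    return sorted(impacts, key=lambda x: abs(x[1]), reverse=True)

for (feat, val), impact in subgroup_impacts(rows, "churned", ["region", "plan"]):
    print(f"{feat}={val}: impact {impact:+.2f}")
```

Even on this six-row example, four subgroups are scored; with dozens of wide columns and millions of rows, the number of combinations explodes, which is exactly why this work belongs with an automated engine rather than a human analyst.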

Build a more proactive process 

Finally, to maximize the impact of your newfound analytics capacity, consider making proactive changes to the primary inquiry and data collaboration processes across your organization.

These can be as simple as opening up the lines of communication across different functions. We’re seeing this happen organically across all of our customers as they adapt to a faster pace of questions and answers. Peter put it this way, “Teams should build processes where analysts and business stakeholders meet to discuss new questions and interpretations on a weekly basis to take advantage of new datasets and business opportunities as they arise.”

There are multiple ways to accomplish this. If you have a centralized data and analytics center of excellence, look to embed individual experts into the business units. Already decentralized? Use Sisu as a catalyst to inspire collaboration on a few core strategic KPIs that cross organizational boundaries. However you tackle it, use this increased speed and capacity to your benefit. It’s incredible how much even an extra hour a week can do.

Making your analyst team more efficient has a multiplier effect. By giving them more time to do the valuable work of analysis, you’re also ensuring that analysis is more valuable and actionable. To find out if Sisu can help improve your team’s efficiency, get in touch for a demo.
