A high volume of data should contain a high volume of useful information. So why do we struggle to find facts we can use when we need them? That’s precisely the problem that Peter Bailis set out to solve when he arrived as an Assistant Professor at Stanford University in 2015.
“The core question was what do you do with massive amounts of structured data when you don’t have enough time to dig in and look at every individual event? How do you detect when a change occurs in your system, understand why it occurred, and then take action?”
Recently, Peter sat down with Jeff Meyerson on the Software Engineering Daily Podcast to discuss his research, the challenges of exploring complex enterprise data, and how thoughtful design can make data more accessible to more people.
You can find the full recording and transcript here, or subscribe to the Software Engineering Daily Podcast via iTunes or Google Play.
Looking for the tl;dr? Here’s a quick synopsis:
- How do you tackle the problem of finding useful facts in mountains of data when both your tools and your time are stretched beyond their limits?
- Peter and Jeff discuss Peter’s research at Stanford on data-intensive systems and how he addresses a problem that every enterprise faces: massive amounts of structured data and not enough resources to properly diagnose changing metrics.
- Practical examples from Peter’s work with Microsoft, Facebook, and Google, and how they informed the product he’s building at Sisu.
- Why it’s necessary to develop novel end-user interfaces for operational analysts that enable them to go deeper and deliver “business friendly” recommendations to their organizations.
- Finally, how other open source projects like Apache Spark have inspired Sisu to stay focused on running fast and iterating with customers.