
Analytics predictions for 2016

Nine predictions for the analytics space in the year ahead, as forecasted by Slalom San Francisco’s Kyle Roemer.

Kyle Roemer | January 29, 2016

Yes, this is yet another set of predictions you didn’t ask for, but hopefully it will shed some light on the ever-evolving analytics space in the year ahead.

In my role as a leader in our information management and analytics practice, I have the privilege of talking to companies large and small. It’s my job to understand not just what is taking place in the analytics market, but also to anticipate where things are heading.

Here are my predictions for 2016.

1. Data viz and enterprise BI will quickly follow the web tech and digital evolution

I’m using web tech and the digital design evolution as a lens for where data visualization and analytics apps are heading. Analytics applications have lagged behind modern design and development paradigms for too long; customers now demand that their enterprise analytics apps work the same way as the websites and apps they use in their daily lives.

Web tech evolution, as visualized by evolutionoftheweb.com: technology leaps in the last five to seven years have transformed the way we think about interfaces and interactions. Expect data visualization and analytics apps to follow this evolution more quickly than in prior years.

A lingering design principle in web-based analytics solutions is designing for many user roles, which results in cumbersome user flows and makes it challenging for users to find the most relevant content. While multiple roles are necessary within an analytics application for security and functionality reasons, it’s more important to start simple and let users discover additional functionality.

Pinterest and Tumblr both do this well. While there are social aspects to content discovery on those sites, I think analytics applications can learn from the practices they employ.

More modern design practices are being employed at new analytics companies, but many of these experiences are unfortunately similar. When you look at startups in the data viz space like DataHero (recently acquired), Chartio, or Argo, you don’t see much differentiation, which brings me to my next thought…

2. New data viz and analytics companies must differentiate themselves

In 2016, “can connect to your favorite cloud source” isn’t enough.

From 2013 to 2015, it was enough if data viz and analytics startups (not including predictive, large data, or storage-related startups) were easy to use, compatible with that cloud source you like, and inexpensive. While there was, and still is, a place in the market for these companies, that ground is now well covered.

So, who is going to tackle the lack of collaboration in analytics today? ClearStory Data is trying, but do we yet understand how people want to collaborate with data?

And who will make a mobile experience that people want to use? Tableau’s Vizable is interesting, but it’s just the start.

New analytics companies started this year will hopefully move beyond the recent “simple interface, connect to cloud sources” paradigm and focus on unsolved issues in the enterprise.

3. Deploying predictive models will get easier

Data science teams building advanced models have a continued need to operationalize them and deploy them into production. This isn’t a new problem, but the proliferation of data science in the enterprise has magnified the challenge, and standards like PMML and PFA haven’t quite gone mainstream yet.

The reality is that data scientists like to use different languages to build their models (R, Python, Java, etc.), and they often don’t have the desire or background to automate and deploy those models, regardless of scale.

Commonly, a model is built by a data science team, and then handed off to a data engineering team to deploy and automate. I rarely see this work well because the models can be complex and require validation, and things get lost along the way in a highly iterative, discovery-based environment. (The Data Mining Group explains it well.)

I expect a company or two to solve this, likely by leveraging these standards while lowering the barrier to learning and deploying them. Zementis, for example, is focused on this area, with an emphasis on large-scale deployments and datasets.
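
To make that handoff concrete, here’s a minimal sketch of what a standards-based export could look like, assuming scikit-learn and the open-source sklearn2pmml package (neither is named in this post): the data science team exports the fitted model to PMML, and the engineering team deploys the file to any PMML-compliant scoring engine without re-implementing the model.

```python
# Sketch: export a scikit-learn model to PMML for deployment.
# Assumes the sklearn2pmml package is installed (it shells out to a
# Java converter under the hood): pip install sklearn2pmml
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

X, y = load_iris(return_X_y=True)

# Wrapping the estimator in a PMMLPipeline keeps preprocessing and
# the model together in one portable artifact.
pipeline = PMMLPipeline([("classifier", DecisionTreeClassifier(max_depth=3))])
pipeline.fit(X, y)

# The .pmml file is what crosses the team boundary: any PMML-compliant
# scoring engine can serve it as-is, in whatever language it runs.
sklearn2pmml(pipeline, "iris_tree.pmml")
```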

4. Data viz programming libraries will get easier, too

Libraries like D3.js continue to gain momentum with analytics groups, and I don’t foresee this declining. Given D3’s prominence in the market, I expect it to become easier and easier to build visualizations leveraging these libraries.

I also expect more abstraction layers, like C3.js and the fantastic work being done on Vega by Jeffrey Heer, to lower the bar for building custom visualizations. (This kind of web-first approach to data viz is the path forward.) It’s always exciting to see the continued evolution of the D3 library by Mike Bostock, as well as projects like Block Builder by Ian Johnson.
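
To show what that abstraction buys you, here’s a sketch of a chart spec in the declarative style of Vega-Lite, the higher-level grammar from Heer’s group built on top of Vega (the data and field names here are invented). You describe what to plot, and the library works out the rendering that would otherwise be hand-written D3 code:

```python
import json

# A minimal bar-chart specification: data, mark, and encodings.
spec = {
    "data": {"values": [
        {"region": "West", "sales": 28},
        {"region": "East", "sales": 55},
        {"region": "South", "sales": 43},
    ]},
    "mark": "bar",
    "encoding": {
        "x": {"field": "region", "type": "nominal"},
        "y": {"field": "sales", "type": "quantitative"},
    },
}

# Write the spec to disk; a web page can hand this JSON to the
# Vega-Lite runtime to draw the chart.
with open("sales_chart.vl.json", "w") as f:
    json.dump(spec, f, indent=2)
```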

5. Core analyst skills will become [even more] paramount

It’s now easier than ever to integrate data sources and visually explore and run advanced analytical models on that data. With tools like Tableau, Qlik, Alteryx, and Paxata, the bar has been lowered. People can now explore and analyze their data in ways never available to them before. Finance, marketing, and HR analysts are empowered to do things only “technical” individuals could do in the past. It’s been incredible to see at the clients I serve!

With this ease of use comes the need to have foundational skills and understanding of data to avoid mistakes and false conclusions.

The idea that you can drag an R model onto a dataset in Alteryx to predict an outcome is amazing, but if you don’t know how to interpret or understand what is taking place in that model, it can lead to false conclusions. Many have written on this, but I hope these tools are accompanied by continued education on data and statistical analysis in the workplace. Those core skills will remain paramount in drawing data-driven conclusions.
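
One of the simplest of those core habits is comparing a model against a naive baseline on held-out data before trusting it. Here’s a short sketch of that check using scikit-learn, with one of its bundled example datasets standing in for real business data:

```python
from sklearn.datasets import load_diabetes
from sklearn.dummy import DummyRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hold out a test set the model never saw during training.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
baseline = DummyRegressor(strategy="mean").fit(X_train, y_train)

# If the model barely beats "always predict the average," its output
# isn't worth acting on, no matter how polished the dashboard looks.
print("model MAE:   ", mean_absolute_error(y_test, model.predict(X_test)))
print("baseline MAE:", mean_absolute_error(y_test, baseline.predict(X_test)))
```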

6. Innovation will strive to solve computing at scale

Hadoop will continue to be deployed at a higher frequency, but we’ll see innovation on solving computing at scale with common languages and skills.

Computing at scale on commodity hardware has worked for a number of enterprises, although the silver-bullet aura around Hadoop has faded. (That’s a good thing.) Given the complexity of the Hadoop ecosystem and the need to learn and support new languages, I expect some innovation on lowering that barrier.

We’ll see more push toward common languages like SQL and standard DBA skills, versus a proliferation of languages like Hive and Pig. The Cosmos platform from Microsoft is a step in the right direction, and I’m interested in what MapD will bring to the table as well.
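
Spark’s SQL interface is a good illustration of the trend, even though it isn’t one of the platforms named above. Here’s a sketch, assuming PySpark’s 2.x API and an invented HDFS path and schema, of an analyst running a plain SQL aggregation over distributed data with no Pig or hand-rolled MapReduce in sight:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

# Hypothetical event data on HDFS; registering it as a view makes it
# queryable with the same SQL an analyst would write against any database.
events = spark.read.json("hdfs:///data/events/")
events.createOrReplaceTempView("events")

top_products = spark.sql("""
    SELECT product_id, COUNT(*) AS purchases
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY product_id
    ORDER BY purchases DESC
    LIMIT 10
""")
top_products.show()
```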

7. Data viz mashups will trump “one tool across the enterprise”

Large enterprises often want to standardize on a single reporting, data visualization, or analytics tool. Though there are many pros to that approach, the reality is that people use the tool they want to use, whether it’s simple, powerful, or just the one they’ve used for many years. This year, I expect companies to start considering mashups, where they’ll allow users to combine data visualizations from the multiple tools supported within the organization.

The legacy term for this was a BI portal, but those were mostly informational in nature, helping departments understand how to get licenses, training, and so on for the tools available to them.

The idea of a data viz mashup goes a step further, allowing users to combine visualizations from all the tools they use to curate experiences and stories. Doing this elegantly won’t be easy: every tool’s API is different, and there isn’t a great solution today for bridging those APIs. There’s also an inherent tool knowledge gap: to create a mashup, an investment must be made in learning the ins and outs of each tool.

All that complexity aside, I think there are numerous use cases that make this a compelling solution. Imagine, for instance, updating the Google Sheet that feeds your Tableau visualization and seeing the change in real time. Expect to see more of this in 2016 … and the idea of “one tool to rule them all” to slowly fade.
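
The Google Sheets half of that scenario is already scriptable. Here’s a sketch using the gspread library; the sheet name, credentials file, and cell coordinates are all invented, and it assumes your Tableau viz is separately connected to that same Sheet:

```python
import gspread

# Authenticate with a Google service account (the credentials file
# path is hypothetical) and open the Sheet feeding the visualization.
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("Q1 Sales").sheet1

# Push a new figure into row 2, column 3; once Tableau refreshes its
# connection to the Sheet, the visualization picks up the change.
worksheet.update_cell(2, 3, 1250)
```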

8. The next Tableau will be born or revealed

Over the last couple years, I’ve hypothesized at length with my colleagues on who the next Tableau will be.

Tableau has built an incredible following and an amazing product, and it has matured as a company. It’s now the enterprise standard in data visualization for many major companies, replacing traditional players like Oracle, IBM, MicroStrategy, and Microsoft. While Tableau is currently the “it” toolset for analysts, that can quickly change.

In 2016, I expect Tableau’s hold on the market to continue—with hopefully an ever-evolving product—but I also expect a newcomer to make a lot of noise.

I anticipate that a company or two will gain traction this year at larger enterprises. Looker has a great chance, as its powerful SQL abstraction and analyst collaboration truly differentiate it from just another data visualization tool. I’ve also found ClearStory Data’s product interesting, as they’ve focused on a few key areas.

My hope is that someone, somewhere starts or evolves a company that can truly compete with Tableau, which is poised to grow even more across the enterprise. Though I don’t expect anyone to innovate this year to the degree that Tableau did in data viz, there are huge opportunities in the areas Tableau and others aren’t addressing today: collaboration, mobile, customization, and engineering-friendly tools are all places where a company can gain traction.

9. Analytics companies to pay attention to in 2016

Looker: I’ve known the team for a couple of years now, and their commitment to building an application for modern data companies is applause-worthy. Their SQL abstraction layer is powerful, and they’re well poised to make a big jump this year. They’ve focused on evolving an interface that lets data engineers and analysts collaborate and build models in a single environment. Lately, they’re showing up in the same conversations as Tableau and Qlik when customers evaluate tools.

Alation: One comment I’ve heard repeatedly across large tech, financial services, retail, and healthcare companies is: “I don’t know what data is available to me or what reports have been built.” I’ve seen a number of IT departments aim to build solutions that expose this type of information, but often those efforts can’t be sustained due to engineering cost, capacity, and other constraints.

Alation is tackling this challenging topic, and it couples quite nicely as companies look to better govern their data. I expect Alation to make big moves in the market, or more likely, to get acquired this year.


This post was originally published by the author on Medium.

Kyle Roemer is an analytics practice leader in our San Francisco market. He strives to help companies accomplish amazing feats with data and analytics while educating them on upcoming trends and technologies in the space. Connect with Kyle on Twitter, LinkedIn, or Medium.
