
Understanding Big Data Analytics for VARs

December 10, 2017

Big data is a big investment for most organizations. It requires new enterprise resources: data storage, cloud capacity, programming expertise, and more. But the payoff in business insight is worth the investment. Understanding big data is not just a matter of knowing the systems and infrastructure needed to process the data; it requires a working knowledge of the end-to-end analytics process, from the preliminary steps through the tools used for analysis to how the outcome is presented.

A new study by NewVantage Partners reveals that more than 67 percent of those surveyed now have big data initiatives in operation. Only 4 percent report no plans to apply big data at all. The same study expects the share of companies investing more than $10 million in big data to grow from 35 percent to 75 percent by 2017, and the share investing more than $50 million to grow from 6 percent to 28 percent. Clearly, understanding big data pays off for businesses as well as for VARs looking to sell big data solutions.

Understanding big data also requires breaking it down into its basic processes and components and evaluating how the pieces fit together.

The Big Data Process

Understanding big data starts with creating a well-defined use case. Big data brings together multiple data resources to reveal new insights that point to a specific business action. In order to have a big data use case, the business question being asked has to:

  1. Benefit from multiple data sources – Big data is designed to look beyond traditional business intelligence, which means drawing on data outside the organization’s own database. Big data benefits from disparate data sources, including structured database information and unstructured data such as Word documents or social media posts. It also means looking across the company to incorporate data siloed in specific departments or business groups.
  2. Result in an action plan – The insight from big data should deliver ongoing value, such as assessing the competition or tracking product appeal. If you are looking at sales figures for quarterly performance, you don’t need big data, but if you want to bring together multiple resources to understand, say, customer sentiment, that’s an ongoing big data engagement with repeatable results.
  3. Create lasting value – The results need to deliver a measurable, ongoing return. The results of the big data project should be able to deliver insight that will continue to shape business strategy in a meaningful and profitable way.

The Big Data Tools

Once the use case has been developed, you need to determine what big data resources are available. This is not only hardware and software, but also the data sets available in-house, the data needed from external sources, and the analytical capabilities to assess all that data.

The use case will provide a framework for data flow and show you what’s required. Is there adequate data storage available? Do you need a different type of storage, such as Network Attached Storage (NAS) or an object storage schema, to deliver data faster? What about processing power? Do you need virtual computing resources or a parallel computing infrastructure to handle the flood of data? Your infrastructure has to deliver high-speed I/O bandwidth as well as store terabytes or even petabytes of data.
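To give a feel for the sizing questions above, here is a back-of-envelope sketch in Java. All of the figures (ingest rate, retention window, replication factor) are hypothetical placeholders, not recommendations; plug in the numbers from your own use case.

```java
// Back-of-envelope capacity estimate for a big data cluster.
// All figures below are hypothetical placeholders; substitute your own.
public class CapacityEstimate {

    public static void main(String[] args) {
        double dailyIngestTb = 2.0;   // assumed raw data ingested per day, in TB
        int retentionDays = 365;      // assumed retention window
        int replicationFactor = 3;    // typical HDFS-style replication

        // Total storage needed, including replication overhead.
        double storageTb = dailyIngestTb * retentionDays * replicationFactor;

        // Sustained ingest bandwidth needed to land a day's data in an
        // 8-hour processing window (1 TB = 1e12 bytes).
        double windowSeconds = 8 * 3600;
        double ingestMBps = (dailyIngestTb * 1e12) / windowSeconds / 1e6;

        System.out.printf("Estimated storage: %.0f TB (%.1f PB)%n",
                storageTb, storageTb / 1000.0);
        System.out.printf("Sustained ingest bandwidth: %.0f MB/s%n", ingestMBps);
    }
}
```

Even with these modest assumptions, the arithmetic lands in the multi-petabyte range, which is why storage architecture and I/O bandwidth belong in the use case discussion from the start.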

And then there’s the software. Every big data engagement is customized, so there are no true off-the-shelf tools. That means you need to build your own analytics stack using frameworks such as Hadoop and NoSQL databases.

Apache Hadoop is still the dominant programming framework for big data analytics. Understanding big data programming starts with Hadoop itself; a working knowledge of Java, Linux, SQL, JavaScript Object Notation (JSON), and XML will make Hadoop easier to learn, as will higher-level languages like Pig.
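To make the Hadoop and Java connection concrete, here is a minimal MapReduce word-count sketch using the standard org.apache.hadoop.mapreduce API. It illustrates the programming model rather than a production job; the input and output paths are supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in the input split.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Hadoop distributes the map and reduce tasks across the cluster, so the same pattern scales from a single test file to the terabyte-range workloads described above.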

The Outcome

Understanding big data results is the next step. Gathering the raw output of the analysis is only the beginning; you then have to interpret those results to deliver the desired insight.

This is the step where data scientists or statisticians typically step in: someone who understands statistical analysis and can convert the findings into dominant trends and outliers. Expertise in R or a similar statistical computing environment is useful. In any case, expertise in business intelligence, as well as in sales, marketing, and marketing automation, adapts well to big data interpretation.
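The article points to R for this work, but the underlying idea can be sketched in any language. The snippet below uses made-up sentiment scores to compute a mean and standard deviation, then flags any value more than two standard deviations from the mean as an outlier worth investigating.

```java
import java.util.Arrays;

// Minimal illustration of turning raw analysis output into a headline
// trend plus outliers. The sentiment scores below are made-up values.
public class OutlierSketch {

    public static void main(String[] args) {
        double[] weeklySentiment = {0.62, 0.58, 0.61, 0.64, 0.19, 0.60, 0.63};

        double mean = Arrays.stream(weeklySentiment).average().orElse(0.0);
        double variance = Arrays.stream(weeklySentiment)
                .map(x -> (x - mean) * (x - mean))
                .average().orElse(0.0);
        double stdDev = Math.sqrt(variance);

        System.out.printf("Dominant trend: mean sentiment %.2f (stddev %.2f)%n",
                mean, stdDev);

        // Flag anything more than two standard deviations from the mean.
        for (double score : weeklySentiment) {
            if (Math.abs(score - mean) > 2 * stdDev) {
                System.out.printf("Outlier worth investigating: %.2f%n", score);
            }
        }
    }
}
```

Real engagements would use proper statistical tooling, but the goal is the same: reduce raw results to a trend the business can act on and a short list of anomalies worth a closer look.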

What does all this mean for VARs? Understanding big data also means understanding the role you can play in delivering the end results. Identifying vertical markets such as finance, healthcare, or retail gives you an additional advantage, since it layers domain expertise on top of your understanding of big data. Once you see the end-to-end big data process, you can determine where you add the most value, and where you need to enlist partners to fill in the gaps.