
5 Ways to Manage Financial Risk on Big Data Strategy

June 09, 2017


When advising customers about their big data strategy, it’s tempting to encourage them to build an elaborate infrastructure that can grow with their needs. After all, the value of big data has been proven, and the more data you can store and the more computing power you can apply, the better the results, right?

The truth is that while companies rush to embrace big data, the tools and technology are still maturing. Be sure your big data strategy can deliver on the big data promise. The risks may outweigh the rewards, since big data opens the door to security breaches and financial exposure. Some early big data adopters are mistaking noise for insight, spending time and money to produce information that has little value.

IDG’s latest big data enterprise survey reveals that enterprises will spend an average of $8 million this year on big data. Of those surveyed, 70 percent of enterprise organizations have already deployed or plan to deploy a big data project, and 56 percent of small to medium businesses (SMBs) are planning a big data initiative. In order for big data to succeed, businesses must: a) identify areas where big data will have the biggest impact (37%); b) make sure they have sufficient staff to extract big data value (29%); and c) have the networking and storage capacity to support big data (25%).

All these “must haves” can mean a big investment that may not pay off. Big data is over-hyped, so don’t oversell your big data strategy. Test big data’s value before committing to a bigger financial risk.

The Lure of Cheap Data

As tech historian George Dyson said, "Big data is what happened when the cost of keeping information became less than the cost of throwing it away."

Hadoop has made data cheaper to store and to access. Ad firm Neustar said it cost $100,000 per terabyte to store one percent of its data in a data warehouse. Using Hadoop, the company now stores 100 percent of its data at a cost of $900 per terabyte. Hadoop now makes it cheaper to store data than to dispose of it.
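The Neustar figures above make for a simple back-of-envelope comparison. The sketch below assumes a hypothetical total data volume (`data_tb` is an illustrative number, not from the article) to show how storing 100 times the data on Hadoop can still cost less than warehousing one percent of it:

```python
# Back-of-envelope cost comparison using the per-terabyte figures quoted
# above. data_tb is an assumed, illustrative total data volume.

data_tb = 500  # hypothetical total data volume in terabytes

warehouse_cost_per_tb = 100_000  # quoted warehouse cost per TB
hadoop_cost_per_tb = 900         # quoted Hadoop cost per TB

warehouse_total = data_tb * 0.01 * warehouse_cost_per_tb  # warehouse holds only 1%
hadoop_total = data_tb * hadoop_cost_per_tb               # Hadoop holds all of it

print(f"Warehouse (1% of data):  ${warehouse_total:,.0f}")
print(f"Hadoop (100% of data):   ${hadoop_total:,.0f}")
```

At these assumed volumes, Hadoop stores the full data set for less than the warehouse charges for a hundredth of it — which is the economic shift Dyson’s quote captures.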

However, just because data is cheap to store doesn’t mean it’s useful. Even if data storage is cheap, will the cost of sifting through that mountain of data yield enough nuggets of business insight to justify the cost?

The Real Cost of Big Data

While data storage is cheap, extracting intelligence from big data is not. Managing and integrating big data into the enterprise incurs added hardware and software costs, as well as new tools for security, compliance, disaster recovery, and availability.

And then there is the cost of analytics. To leverage Hadoop you need data scientists to write the programs and do the analysis. You have to bring in a whole new skill base, which may cost more than the infrastructure.

Managing Big Data Costs

Here are five tactics to include in your big data strategy to prevent big data costs from getting out of hand:

  1. Plan in advance – Work with the customer to identify the stakeholders and determine what insight they want from big data. Agree in advance on the objectives and scope of the project. Success depends on setting expectations and defining a big data project in a way that will yield actionable insight with measurable ROI.
  2. Develop a use case – Next develop a use case and assemble your resources. Use this as a test case, so you can limit the number of data sources and the cost of analytics development. The objective is to test the use case to assess big data value. Once you have proven the use case, you can justify a bigger investment.
  3. Don’t overinvest in infrastructure – Even though storage is cheap, don’t start installing more storage and servers. Use cloud services to run your use case. If you limit the infrastructure investment at the outset it will be easier to justify it later when you have demonstrated ROI.
  4. Test your analytics – Make sure your analytics deliver the actionable insight required, not just more data noise.
  5. Refine and repeat – If your test case fails to deliver the desired results, identify the weak points, refine your process, and try again. Once you start to see positive results that prove ROI you can start expanding the framework, adding new infrastructure, and hiring staff to generate even greater returns.
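The tactics above all come back to proving ROI on a limited pilot before scaling up. A minimal break-even sketch, using purely illustrative numbers (the pilot cost and monthly benefit are assumptions, not figures from the article), might look like this:

```python
# Hypothetical pilot break-even check. All figures are illustrative
# assumptions for a limited-scope, cloud-hosted use case.

pilot_cost = 150_000      # assumed cloud infrastructure + analytics development
monthly_benefit = 20_000  # assumed monthly value of the actionable insight

# Months of realized benefit needed to recover the pilot investment.
months_to_break_even = pilot_cost / monthly_benefit

print(f"Break-even in {months_to_break_even:.1f} months")
```

If the break-even horizon from the pilot looks reasonable, that is the evidence you take to the customer to justify expanding the framework; if it doesn’t, refine the use case and try again before any bigger spend.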

Big data is still in its infancy, so it’s better to teach your customers to walk before budgeting for a marathon. Show customers how to test the waters and assess the value of the results to prove the ROI of big data, then commit more resources. Where do you start with your customers when proposing a big data strategy?