Big data should be about solving specific problems, not just jumping on the technology bandwagon. Here are six issues to consider as you plan or refine big data projects.
These comments come from the recent Gartner Symposium 2013 event in Queensland, where the role of big data was a recurring theme.
Ignoring big data gives the advantage to your rivals
“The premise of big data is the concept that says: we can know everything,” Gartner research director Gareth Herschel noted. “There is nothing we cannot find out if we want to know it. If you don’t have that information, that means you don’t want that information. What you don’t know is knowable, and if you don’t want it, maybe your competitors do.”
“Big” is not the most important element
“Big data is not just about volume,” Gartner vice president Kristian Steenstrup said. “If it was just about volume it would be much more solvable and a little less interesting. What is happening that is significant is the diversity of data sources and types of data.”
Steenstrup pointed to energy companies tracking power supplies and outages as an example. “The velocity and frequency of data updates may be significant. Outages may offer an extremely rapid burst of significant amounts of data in a very short space of time. This is not neat relationship data in a table. It’s not just about volume but being able to react to that.”
Don’t start with the warehouse
While it may seem logical to start a big data project by drawing on more traditional analytics resources such as existing data warehouses, that may not produce the best results. “Consider walling off your data warehousing and business intelligence initiatives from doing things with big data,” Gartner analyst Ian Bertram suggested. A useful big data project is opportunity-oriented, involves experimentation, and may need systems to be altered rapidly — not something you can easily do with a data warehouse.
Use a hybrid cloud approach
While it’s unlikely that all the data you want to analyse is cloud accessible, using cloud-based analytic tools is going to become increasingly important. That will reflect a broader shift to the use of third-party data centre providers. “The current data centre market is 80 per cent private and 20 per cent public,” Gartner analyst Peter Sondergaard said. “That will shrink to 65 per cent in four years, and about 20 per cent of spending will be on hyperscale systems.”
Make sure your projects benefit customers
Analysing customer data can create thorny privacy issues. If your business is going to benefit from customer data, make sure that’s a two-way relationship. “If you’re going to collect data from your customers, make sure there’s a clear quid pro quo and that it benefits them,” Gartner analyst Douglas Laney said.
Realise you won’t always need it
There’s no value in jumping on the big data bandwagon for its own sake. Gartner analyst Rick Howard offered government as one example.
“Government, with few exceptions, doesn’t really have a big data problem or need,” he said. “There are certainly use cases for the need for granular information from multiple sources in a high-velocity, high-variety way, but we are all familiar with government having a ‘too much data’ problem as it is. We have too much information that we don’t access today. We’re not really doing petabytes of data feeds that we have to analyse in real time.”