The Big Picture of Big Data: Why using it is imperative


by Patricia Rae Linn



Organizations own, or have access to, abundant data that can give them extreme differentiation and competitive edge, but most companies, government agencies and not-for-profits struggle to turn that data into insights that enable timely, actionable, effective and measurable strategy. Even the organizations that do optimize enterprise, machine, social and other data battle challenges that make the Big-Data-to-Intelligent-Enterprise (IE) translation and application undesirably slow and expensive.

By the year 2020, IDC (International Data Corporation) predicts Big Data will reach “40 zettabytes (ZB), which is 40 trillion GB of data, or 5,200 GB of data for every person on Earth.” Currently only 1% of Big Data is analyzed.[i]
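A quick sanity check shows that the IDC figures hang together (this assumes decimal units, where 1 ZB = 10^12 GB, and a projected 2020 world population of roughly 7.7 billion):

```python
# Back-of-the-envelope check of IDC's 40 ZB prediction.
zettabytes = 40
gb_total = zettabytes * 10**12          # 1 ZB = 10^12 GB, so 40 ZB = 40 trillion GB
population = 7.7e9                      # assumed 2020 world population

gb_per_person = gb_total / population   # roughly 5,195 GB, close to the quoted 5,200

print(f"{gb_total:.0e} GB total")
print(f"{gb_per_person:,.0f} GB per person")
```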

The “Big Data Imperative” for organizations is to master insight-based strategic planning and execution using whatever elements of this massive data are available, and then to apply those insights to what they do, faster and better than their competitors. Success requires that analysts understand and respond to the four “Vs” of Big Data:

  • Value — Organizations must be able to identify which data has value, where to obtain it, what that value is, how to extract and analyze the data for actionable meaning, and how to put it to use in a feedback loop that measures results.
  • Volume — With billions of meaningful transactions, communications and process results generated every second, anyone who intends to use Big Data must have the capacity to manage it from storage through prioritization to analysis. Speed to insight is what turns this weighty process into the differentiation the organization wants.
  • Variety — Big Data is varied in source and content, and also in purpose: a single data string may yield insights that apply to more than one division, process or goal of a company. Variety therefore occurs at both the input and output stages of the data-to-insight-to-action process, and organizations must have systems in place to handle both.
  • Velocity — The speed at which data is created and becomes available is nothing short of daunting to humans and machines alike. Those who manage data must make fast, tough decisions about the sample sizes needed, when a data stream starts and when it is shut off, how much integrity the data has and whether it is worth analyzing. They must also consider whether the data has been compromised in any way that could put the organization’s decision-making at risk.
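The sampling decision under Velocity has well-known algorithmic answers. One is reservoir sampling, which maintains a fixed-size uniform random sample over a stream of unknown length, so no one has to buffer the whole stream before deciding what to analyze. A minimal sketch (the event stream here is simulated):

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of up to k items from a stream of unknown length."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir with the first k items
        else:
            j = random.randint(0, i)     # item i survives with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample

# Example: a 10-item sample drawn from a simulated stream of one million events.
events = range(1_000_000)
print(reservoir_sample(events, 10))
```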

Ultimately, every organization that takes the Big Data Imperative to heart, and recognizes it as a cornerstone of future business success, must become an Intelligent Enterprise (IE) that consolidates best practices in people, process and technology for competitive differentiation. The data management professionals in such organizations must understand, act upon and manage the fast, accurate systems that make this possible.

Why the Imperative

Every industry, and every business function within each industry, is influenced daily by big data.[ii]

“Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers.”[iii] (McKinsey Global Institute)

Simply put, Big-Data-driven insight will be what drives business by 2020. For every organization, the four core benefits of using Big Data analytics for strategy-driving insight are:

  1. Customer Growth and Retention: Data-driven insight improves an organization’s ability to identify the right customers, attract them, convert them, retain them by preempting defection, and transform them into brand ambassadors who sell the organization voluntarily. This happens when company teams can hear exactly what customers want, how they want it, and what they think of their experience with the company, and react to that information on a timely basis.
  2. Profitability and Sustainability: By selling the right value proposition to the right customer the right way, every time, organizations reduce the waste of returns, unwanted inventory and customer-service damage control; increase volume and margins; and keep products and services fresh and in demand based on customer feedback. All of this consistently and positively impacts the bottom line.
  3. Process Improvement and Change Management: An intelligent, data-driven enterprise virtually eliminates the guesswork in innovation, sales, marketing and creation of the customer experience, streamlining those processes into a clear, concise pathway to predictable outcomes. When outcomes are clear at the outset and carry inspiring dividends, and team members know the pathway that will deliver them, aligning processes with goals changes the way things are done and makes them easier to accomplish.
  4. Product and Service Innovation and R&D: The customer is (almost) always right. While great innovators develop products and services customers want, those consumers create data that makes their needs and desires very clear… and their disappointments obvious. Indeed, a solidly communicative customer base can virtually do away with the need to manage and compensate high-end innovation teams.

About Big Data

  • Big Data comes from almost everything: sensors used to gather climate information, posts to social media sites, digital pictures and videos, transaction records, cell phone GPS signals, and more.
  • Each day, people create 2.5 quintillion bytes of data, both intentionally and accidentally; for example, by sending 1 billion Tweets daily and 30 billion messages on Facebook each month.
  • 90% of all data is unstructured, has a lot of noise built in, is frequently corrupted, and almost always has value.[iv] Organizations will have to have app-driven analytic infrastructures to handle this massive inflow of data.
  • Big Data Statistics[v]:
    • By 2014, 90% of all existing Big Data had been created in 2012 and 2013
    • In 2012, the Big Data market was valued at $10.2 billion; it will be a $53 billion market by 2017
    • 70% of Big Data is generated by people not machines
    • Organizations self-store about 80% of the data they create and use.
    • The Internet of Things gains 570 new websites (data sources and generators) every minute of every day.
    • By 2020, 33% of Big Data will be in the cloud.
    • A 10% increase in data accessibility translates into an additional $66 million in net income for the average Fortune 1000 company.
    • 65% of senior executives are basing management decisions on Big Data analytics insight.

Trends and Technologies

  • Data Storage/the Cloud: With data volumes growing roughly 60% annually, storage demand is increasing too rapidly for technology to keep up. Cloud technologies, the current “big hope” for storage and dissemination of big data, still perform well below the single-drive benchmark of roughly 30 minutes to read 1 terabyte (considered good performance for a single-drive read).
  • Ability to combine Big Data sets and still obtain insightful information via complex queries quickly because of increased processing capability and speed.
  • Open-source data-integration tools that allow diverse data types and sources to be blended into analysis-ready data sets of any scope.
  • The emergence of the “industrial Internet,” GE’s term for the Internet of Things combined with big data platforms. The challenge, says Jim Fowler, CIO of GE Water & Power, is creating open platforms for ingesting and sharing high-scale data, and building applications while also ensuring security.[vi]
  • 2014 will be the year of: “What can I do with big data?” rather than: “What is big data?” [vii]
  • Real-time in-memory analytics, complex event processing and ETL will combine: serial extract, transform and load (ETL) processes will largely go away in 2014. As the velocity of data increases, especially social data, there is more need to analyze data in real time, as a stream.[viii]
  • Master data management (MDM) single-definition data, now exclusively internal, will extend to external data as well, defining the situations and dimensions that solve big problems.
  • Not Only SQL (NoSQL), developed to facilitate data analysis without strict parameters and concrete schema, will further develop into a master core system (or a few), thus consolidating NoSQL access/services.
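The predicted shift from serial ETL to real-time stream analysis can be illustrated with a running aggregate that updates as each event arrives, rather than waiting for a batch extract-transform-load pass to finish (the event values below are invented):

```python
def running_mean():
    """Incrementally update a mean as events arrive, instead of batch-loading first."""
    count, total = 0, 0.0

    def update(value):
        nonlocal count, total
        count += 1
        total += value
        return total / count   # current mean after this event

    return update

update = running_mean()
for latency_ms in [120, 80, 95, 110]:    # simulated event stream
    current = update(latency_ms)

print(f"mean latency so far: {current:.2f} ms")   # prints 101.25 ms
```

The same pattern extends to windowed counts, complex-event triggers and other in-memory analytics: state is small and per-event, so insight is available while the stream is still flowing.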

In Summary… Business has entered an “Information Imperative Age” that is rooted in the massive data generated by people and the machines they use.

“Analyze or Atrophy” is fast becoming a mantra organizations need to follow to compete on every level of business.

[i] John Gantz and David Reinsel, The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East, IDC, sponsored by EMC.

[ii] Data, data everywhere: A special report on managing information, The Economist, 2010.

[iv] IBM, 2010.

[v] Dennis McCafferty, Surprising Statistics About Big Data, 2014.

[vi] 3 Trends Driving Big Data Breakthroughs: A CIO’s View, InformationWeek, 2014.

[vii] Andy Walker, Trends in Big Data: A Forecast for 2014, CSC.

[viii] Andy Walker, Trends in Big Data: A Forecast for 2014, CSC.


Additional Reading

Apache Spark Turbo Charges Big Data Analysis

Big Data Analytics for Inclusive Growth


