
Big Data, the Cloud, and Why Management Matters


There is a growing trend in the cloud world. It’s clearly evident that there are now more devices, more connections to the Internet, and more demands from the end-user. Organizations already have to battle on the BYOD and IT consumerization front. Now, these same companies and IT environments have a new challenge to face: big data.

The average user already utilizes 3-5 devices to access a corporate data center. As more devices come online and connect to the cloud, there will, unquestionably, be more data that needs to be managed and quantified. Here’s the interesting part – small organizations are facing the device and user challenge as well.

A recent GE Capital study outlined how demands on increasingly complex data center assets continue to evolve as mobile broadband expands, big data analytics becomes more prominent, social media attempts to replace basic email as the holy grail for marketers, and cloud services become more entrenched in both consumer and corporate America.

Consider this specific example from the study – capital spending on server, storage, and cloud infrastructure to support big data efforts globally is anticipated to increase at a 37.6% CAGR between 2012 and 2016, and with it comes increased demand for housing the incremental equipment in data centers. Corporations are increasingly digitizing and analyzing all forms of structured and unstructured data in order to compete both locally and globally.

Cloud services are expected to continue to grow as technology complexity evolves with an increasing array of both applications and devices being utilized by an increasingly mobile workforce.

Today, verticals across the board are giving us examples around the explosion of data. Let’s analyze some numbers:

  • According to IBM, the end-user community has, so far, created over 2.7 zettabytes of data. In fact, so much data has been created so quickly, that 90% of the data in the world today has been created in the last two years alone.
  • As of 2012, more than 240 terabytes of information had been collected by the US Library of Congress.
  • Facebook processes over 220 billion photos from its entire user base, with another 300 million photos being uploaded every single day. Furthermore, it stores, analyzes, and accesses over 32 petabytes of user-generated data.
  • In 2012, the Obama administration officially announced the Big Data Research and Development Initiative. More than $200 million has since been invested in big data research projects.
  • In a recent market study, research firm IDC released a forecast showing the big data market growing from $3.2 billion in 2010 to $16.9 billion in 2015.
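Growth figures like these are easy to sanity-check. A compound annual growth rate (CAGR) is simply the constant yearly rate that carries a starting value to an ending value over a given number of years. As a quick sketch (the helper function below is ours, not from either study), the IDC forecast above implies:

```python
def cagr(start, end, years):
    """Constant annual growth rate that carries `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# IDC forecast above: $3.2B in 2010 to $16.9B in 2015 (5 years)
print(f"Implied CAGR: {cagr(3.2, 16.9, 5):.1%}")  # roughly 39.5% per year
```

In other words, the IDC numbers imply the market multiplying by more than 5x in five years – the same order of magnitude as the 37.6% infrastructure-spend CAGR cited by GE Capital.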

Here’s the reality, however – what good is a large amount of data if it can’t be analyzed or quantified? Furthermore, there is the need to manage such large data sets over a cloud or WAN environment.

Hadoop, for example, has become the unofficial standard for big data management engines. Still, these platforms are very new and require a new type of skill set to deploy and manage. The other challenge is that there aren’t many people who can speak the language of big data. Remember, all of these technologies are still emerging and only just becoming an adopted practice for big data management.
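Hadoop itself is a Java framework, but the MapReduce model behind it is easy to illustrate. The following is a minimal pure-Python sketch of the map–shuffle–reduce flow (a word count, the canonical example) – not actual Hadoop code, just the pattern a Hadoop job distributes across a cluster:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (key, value) pair for each word in each record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all values by key (Hadoop does this across nodes).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a final result.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data needs big tools", "data is big"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts)  # {'big': 3, 'data': 2, 'needs': 1, 'tools': 1, 'is': 1}
```

The value of Hadoop is that each phase can run in parallel on commodity hardware over petabytes of data – the logic stays this simple even when the data does not.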

In fact, a recent Ventana Research report (done in conjunction with Karmasphere) clearly shows just how new this big data management technology really is. For the report, Ventana polled IT managers, developers, and data analysts at hundreds of companies of various sizes, spanning multiple industries. What did the report say? Overwhelmingly, 94 percent of Hadoop users perform analytics on large volumes of data that were not possible before; 88 percent analyze data in greater detail; and 82 percent can now retain more of their data. This means that our capabilities around controlling and analyzing data are continuing to improve.

What this means to you and your business

As the GE Capital study points out, the total number of devices worldwide that are connected to the Internet is forecast to increase at a 15% CAGR during the current decade, driving the need for increased computing power. At the same time, the number of connected devices per person will increase from fewer than 2 in 2010 to nearly 7 by 2020.

This means that regardless of your industry, vertical, or even company size – the information you have around your business and your users is critical to your success in this market. New kinds of analysis tools will allow you to integrate with new kinds of resources. All of this enables you and your organization to obtain deeper levels of understanding around the market. In working with big data tools and management solutions, consider the following:

  • Look for technologies that are beyond their 1.0 days. If you’re a newcomer to the world of big data, use tools that make it easier to understand your data. Cleaner UIs, better correlation engines, and even use-case-specific analytics can go a long way. Remember, your data set is unique – the solution you work with must meet your specific needs.
  • Look for cloud integration and automation. This should be a baseline standard for big data management. You’ll also want to look for good API integration as well as the ability to integrate with underlying storage systems and repositories. Big data engines will often have partnerships or integration points with large storage providers.
  • Look for adaptability and scale. Your data can be as elastic as your business. The way you manage your big data needs should be no different. For larger organizations, creating data zones for analysis might be the standard. For smaller use-cases, companies might be bouncing vast amounts of data between a local point and one in the cloud. Regardless, your architecture must be agile and efficient. Latency, poor processing times, and architecture inefficiency can lead to slow response times. And, in a world of data-on-demand, this could be detrimental to the business.

Remember, the cloud can be a very powerful platform. It has the capability to streamline processes and deliver large applications or even desktops almost seamlessly to any device with an Internet connection. The popularity of the cloud model is growing – but so is the management concern. In designing a solid cloud platform, take the time to understand all of the underlying components, how best to manage them, and how to optimize the value of your data.


About the Author

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. His architecture work includes virtualization and cloud deployments as well as business network design and implementation. Currently, Bill works as VP of Strategy and Innovation at MTM Technologies, a Stamford, CT based consulting firm.

One Comment

  1. Hi, great piece, and we agree with every final point you made, but... you forgot one: SERVICE. It isn’t enough to check all the other points off the list and miss this vital component. Look for a company that is supremely dedicated to its customers. They must live and breathe your business, know your name, and be there for you – 24/7, with a real person on the phone. It shouldn’t matter if you are spending $150/mo or $1k – you should feel like you are their only customer, their best customer, and that they are a part of your trusted IT staff. If you are missing that, what good does a cheaper platform do for you when it goes down or if connectivity goes wrong? It’s not a matter of ‘if’, but ‘when’.
