As Cloud Adoption Grows, NetApp Focuses on a Software-Defined Approach to Storage and Data


TORONTO, CANADA – Executives from storage and data management company NetApp were in Toronto on Wednesday to meet with Canadian customers and channel partners, and were eager to explain how storage solutions can help in an IT world increasingly defined by flash and cloud.

CPU speeds have climbed steadily in line with Moore’s Law, and network connectivity has gone from dial-up to broadband. Storage, however, has lagged behind.

“Storage is the number-one bottleneck in the industry today,” said Lee Caswell, NetApp’s VP of Product, Solutions, and Services Marketing. Flash storage, with its 10- to 20-fold performance gain, is part of the solution.

He noted that storage performance also affects CPU usage: it’s common in a data center for CPUs to run at around 40 percent utilization because they’re just waiting for storage. “Flash dominates disk on every aspect except for price today,” Caswell said. Flash is being adopted as costs come down, he said, but also because of management savings: it makes it easier for staff (often a company’s best and brightest) to debug and tune performance.
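
To see why faster storage translates into higher CPU utilization, consider a back-of-the-envelope sketch in Python. The per-request timings below are hypothetical, chosen only so that the disk case lands near the 40 percent figure above; the 10x speedup is the low end of the flash gain Caswell cites.

```python
# Back-of-the-envelope model: hypothetical per-request timings, chosen so
# that CPU utilization lands near the ~40 percent figure cited above.
compute_ms = 4.0   # CPU work per request (assumed)
disk_io_ms = 6.0   # time the CPU sits idle waiting on disk (assumed)

util_disk = compute_ms / (compute_ms + disk_io_ms)
print(f"CPU utilization on disk:  {util_disk:.0%}")   # 40%

# Apply a 10x storage speedup, the low end of the 10- to 20-fold
# flash gain Caswell cites.
flash_io_ms = disk_io_ms / 10
util_flash = compute_ms / (compute_ms + flash_io_ms)
print(f"CPU utilization on flash: {util_flash:.0%}")  # ~87%
```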

Read more: NetApp Buys All-Flash Storage Vendor SolidFire for $870M

It’s About the Data, Not the Disks

A central part of NetApp’s mission is to make it possible to move and manage data across these varied environments. Since data is no longer centralized in the data center, privacy, security, and sovereignty assurances need to follow data wherever it resides.

“We think that what is increasingly relevant around storage and about systems is not the disk drives,” said NetApp CEO George Kurian. “We think the real value that storage offers is access to people’s data, and we think that’s a software-defined paradigm. Yes, we build some of the world’s best systems, and we’ll continue to do that, but what you’ll see from us is to continue to use software to blur the boundaries of systems, blur the boundaries of data centers, and help you most importantly get value from your data.”

Storage now plays a role in data privacy, security, and other areas it once had little to do with, a shift that would have been hard to predict before cloud computing enabled the massive distribution of data.

NetApp Sees Long-Term Opportunities in Storage Software, Not in Inventing New “Mousetraps”

With its recent acquisition of flash storage platform provider SolidFire, NetApp is acting as a consolidator, but it has also fostered a culture of in-house innovation that has kept it relevant for decades.

Smaller startups in the space, Kurian said, face a “brutal” landscape, with high burnout rates and venture capital inflows down 70 percent year-on-year. The underlying problem, he said, is that they haven’t “thought about the big picture of innovation,” focusing instead on building “a better mousetrap that solves a niche problem” rather than “really thinking about the next decade of IT.”

He sees a similar deficit of innovation among large competitors. “One of our leading competitors is about to be swallowed up by a PC manufacturer,” he said, an unmistakable reference to Dell’s acquisition of EMC. The premise of that transaction, he said, is that they will reinvent the data center by applying ideas from the PC industry. In his view, it was the partnership between Microsoft and Intel that created the PC; “the others,” such as HP and Dell, were “simply building shell cases around the innovations that Microsoft and Intel drove.”


Keeping Data Safe and Manageable Wherever It’s Located

Caswell said that companies are choosing a variety of architectures. Outside the US especially, many opt for on-premises clouds or local cloud services, where data sovereignty is better assured and data is kept free from subpoena by authorities in countries it would otherwise pass through.

But there are many instances where public cloud makes economic sense. It isn’t always the cheapest option on a per-gigabyte basis, but its flexibility and scalability might make it necessary.

“Customers are thinking about the public cloud as a tool,” Caswell said, noting the popular configuration of using flash on-premises and disk in the cloud: “Flash for performance and management locally, and maybe only the cloud has the scale to justify the still real but increasingly nominal cost differences between disk and flash.”
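
As a rough illustration of that hybrid split, a placement policy might look something like the sketch below. The dataset names, access threshold, and tier labels are hypothetical assumptions for illustration, not drawn from NetApp’s products.

```python
from dataclasses import dataclass

# Hypothetical sketch of the placement policy described above: hot data
# on local flash, colder data on cheaper cloud disk. The threshold and
# names are illustrative assumptions, not NetApp's actual software.

HOT_ACCESS_THRESHOLD = 10  # accesses per day; arbitrary illustrative cutoff

@dataclass
class Dataset:
    name: str
    accesses_per_day: int

def choose_tier(ds: Dataset) -> str:
    """Pick a storage tier for a dataset under this toy policy."""
    if ds.accesses_per_day >= HOT_ACCESS_THRESHOLD:
        return "on-prem flash"  # performance and local manageability
    return "cloud disk"         # scale and per-gigabyte cost

for ds in (Dataset("orders-db", 250), Dataset("2012-archive", 1)):
    print(f"{ds.name} -> {choose_tier(ds)}")
```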

In such a scenario, having a data fabric to move data between those environments and enforce data governance becomes important.

Across these varied environments, another vital issue is software interoperability: keeping software stacks open to different technologies, which are liable to change just as much as an organization’s needs do. This is another area where NetApp considers itself to have an advantage.

“Every other [large] competitor to us is trying to build closed stacks,” Kurian said, referring to Dell and EMC and their integrated software. “My view on that is that they’re working on the wrong problem, which is that they think the data center is all about hardware and I think that our view is that the innovation in the data center is about software.”

While it’s tempting to think of storage as just a device that can be plugged into a data center, it’s becoming more important to think of it within a software-defined model.

“The problem of data management spans the boundaries of any physical system, and that is the fatal flaw in Michael [Dell]’s analysis,” Kurian said. Taking a historical perspective, he explained that data sat in the data center until PCs came along and took much of it to the desktop. Then, just when Microsoft thought it had control of all the world’s data, the internet came along and a huge amount of data moved there. Then, just when Google thought it had control of the world’s data, the mobile phone put much of it into our pockets. This, he said, is the flaw in thinking about data management as a system problem.

“What you really need to do to help people manage their data is to bring your tools to the places where they want to put their data,” he said. “And that’s the idea of data fabric: it’s software defined; it helps you get the flexibility from data management architectures you need in the next year of IT.”

