While everyone can agree that cloud computing is something the government wants, the field is evolving so fast that GSA had to replace its cloud-computing RFP after only a year. Currently, cloud computing is mostly used for data access and for enterprisewide applications such as email or sales information.
Cloud computing places core business activities in a secure, centralized location. The only cost to the business is the license fee, removing the cost of internally “owning, updating and managing an enterprise’s information systems, which can account for up to 80 percent of business costs,” according to Jake Zimmerman of Command-Control.
Mel Greer, chief strategist of cloud computing at Lockheed Martin Information Systems & Global Services, thinks cloud computing will contribute to a “seismic shift in IT development and delivery over the next five years. As IT activities devolve into business units and become consolidated with other central functions (e.g. HR, finance, etc.), cloud computing will simplify and provide external options for IT development and delivery capability.”
Greer said the rise of smart mobile devices and ubiquitous sensing will drive an exponential increase in data volume, much of it available via the cloud. In fact, Web 2.0 applications pushed the software industry to adopt a once-obscure unit for the amount of data available through the cloud: the zettabyte. A zettabyte is one billion terabytes, and University of Pennsylvania linguistics professor Mark Liberman estimates that all speech spoken by humans since the dawn of time, digitized as 16-bit audio, would take up about 42 zettabytes. That’s the scale of data that large-scale cloud applications deal with.
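For scale, here is a quick back-of-the-envelope check of those figures as a short Python snippet (a sketch only, assuming decimal SI units):

```python
# Sanity-check the zettabyte arithmetic cited above (SI decimal units).
TERABYTE = 10**12   # bytes
ZETTABYTE = 10**21  # bytes

print(ZETTABYTE // TERABYTE)          # 1000000000 -> one billion terabytes
print(f"{42 * ZETTABYTE:.1e} bytes")  # 4.2e+22 bytes for the all-human-speech estimate
```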
“Cloud computing adoption will continue to grow as infrastructure and applications become more available as virtualized, configurable and scalable services, via an easy to understand consumption model,” Greer said. “Cloud capability will extend to include desktop transformation where desktop virtualization, applications and unified communications combine to support greater workforce mobility.”
He notes that, even as the American public grows comfortable putting personal and private data in the cloud, “governments will be reticent to use cloud computing for classified, national security and most war fighter support functions. Security, legal and liability issues associated with trust, privacy and confidentiality will need to catch up to cloud user capabilities before the vast majority of personal and private functions migrate to the cloud.”
As with any game-changing innovation, “cloud computing is many things to many people,” said Tommy Gardner of ManTech. “The theoretical concepts of what the cloud can do will be different from the practical implementations. Market demand will drive the difference between theoretical and practical determinations about which designs stick and which fall by the wayside.”
He said, “Getting to market first is important, but only if you get there with a product that works and adds value to the computing experience,” and predicted that, in five years, information assurance and data-integrity issues will have been solved within the cloud. “In five years, the market will stabilize and standards will exist,” he said, but, because of the scale of data in the cloud, “massively parallel systems will be slow to migrate to the cloud until issues with data latency are resolved.”
John Bordwine, Symantec’s Public Sector CTO, believes that five years from now, “cloud computing may well be much more of an application delivery environment. Applications on demand versus accessing just information in the cloud via local applications.”
“I realize this is a viable component today, but how many government agencies, or even commercial entities, will trust not just information access but the ‘full package’ to the cloud until it is a well-proven technology?” he asks.
While Bordwine said most computing functions can be accomplished from a cloud perspective, some are suited only to a private cloud network, to ensure proper confidentiality, integrity and availability.
“It could possibly be that a function that cannot be guaranteed an acceptable SLA for C, I and A would be a strong candidate for non-migration,” he said.
Data security in the cloud has been a concern since the beginning of cloud computing.
“If an unauthorized user gains access to the network, they get access to an entire company’s vital information, as TJ Maxx learned last year. Faith in third-party security is essential for any cloud computing application,” Zimmerman said.
Ultimately, the biggest obstacle to cloud computing isn’t security, but the effective implementation and execution of a cloud network. “A bad plan with good execution is better than a great plan with poor execution,” he added.
A major revolution is brewing in the cloud to address latency and availability issues: big data computing. In the ’90s, the only way to search through data was to crawl it piece by piece, the way a desktop search function scans your hard drive for a file you can’t find. Now, parallel processing, popularized by Google, is making it possible to migrate large-scale data processing to the cloud.
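To make the contrast concrete, here is a minimal Python sketch of the two models: one process crawling files in sequence versus the same search fanned out across worker processes. The file names are hypothetical placeholders.

```python
from multiprocessing import Pool

def count_term(path, term="cloud"):
    """Scan one file and count occurrences of a search term."""
    with open(path) as f:
        return sum(line.count(term) for line in f)

if __name__ == "__main__":
    files = ["log1.txt", "log2.txt", "log3.txt", "log4.txt"]  # hypothetical inputs

    # Old model: one process crawls every file, end to end, in sequence.
    sequential_total = sum(count_term(p) for p in files)

    # Parallel model: each worker scans its own file; partial counts are combined.
    with Pool(processes=4) as pool:
        parallel_total = sum(pool.map(count_term, files))

    # Same answer either way; only the execution model differs.
    print(sequential_total, parallel_total)
```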
To help explain the concept, we spoke to Sam Charrington at Appistry, the company behind CloudIQ software.
One of the things that can help put this in perspective, Charrington said, is to look at the traditional way of delivering these data-centric applications: picture an island of compute power on one end, an island of data storage on the other, and a straw connecting the two. The traditional model is to suck the data through the straw from the storage island to the compute island and then do the processing there.
Parallelization was initially applied to that compute island. What parallelization is meant to accomplish is to say: instead of one large server on the compute island, break the work up across a larger number of smaller, inexpensive servers, doing many pieces of the work in parallel.
It’s like vertically integrating your computation and storage processes, and it’s not a new idea. “If you talk to old-schoolers, parallelization as an idea has been around for a long time,” Charrington explained. “What we are doing is extending that idea beyond just computing to storage.”
Revolutions in computing technology have made this practical, in the form of “computational storage, or, as some people call it, data locality. Basically, moving your compute workloads to your storage, so you don’t have to pull the data through that straw anymore.”
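A toy illustration of data locality, assuming nothing about Appistry’s actual API: rather than copying every record back to one machine, ship a small function to wherever each shard of data lives and pull back only the partial results.

```python
# Sketch of data locality: move the computation to the data, not the reverse.
# The "nodes" here are plain dict entries standing in for storage servers.

shards = {
    "node-a": ["error: disk full", "ok", "error: timeout"],
    "node-b": ["ok", "ok", "error: disk full"],
}

def count_errors(records):
    """The work we ship to each node; it runs next to the data it reads."""
    return sum(1 for r in records if r.startswith("error"))

# Pulling data "through the straw": every record crosses the network.
pulled = [r for records in shards.values() for r in records]
total_pulled = count_errors(pulled)

# Data locality: only one small integer per node crosses the network.
total_local = sum(count_errors(records) for records in shards.values())

assert total_pulled == total_local == 3
```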
What gives many of these revolutionary computing processes their enormous transformational potential is their background in open-source projects. Google described MapReduce and the Google File System in publicly available lectures and white papers, which led to the development of Hadoop, the open-source software framework that forms the backbone of products like Appistry’s CloudIQ and Yahoo’s search engine.
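For readers new to the model, the canonical MapReduce example is a word count: a map step that emits (word, 1) pairs and a reduce step that sums them per word. The Python sketch below is a local, single-process simulation of the map-sort-reduce pipeline; on a real Hadoop cluster, many mappers and reducers would run in parallel across the data.

```python
import sys
from itertools import groupby

def mapper(lines):
    """Map step: emit a (word, 1) pair for every word seen."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    """Reduce step: pairs arrive sorted by key; sum the counts per word."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Locally, the pipeline is: map -> sort -> reduce.
    mapped = sorted(mapper(sys.stdin))
    for word, total in reducer(mapped):
        print(f"{word}\t{total}")
```

Run locally as `python wordcount.py < input.txt`; the sort step stands in for the shuffle phase a cluster would perform between mappers and reducers.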
“Open source has been essential to cloud computing’s development, and will be a driving force for the foreseeable future. It’s very exciting,” said Gunnar Hellekson of Red Hat. “You can start with virtualization, which is a foundation technology for cloud computing. The open source Linux operating system forms the underpinnings for nearly all the major virtualization solutions on the market today. This is, in part, because Linux is a robust, well-supported platform for virtualization projects. Instead of building something from scratch, developers can start with software that already solves 80 percent of the problem. With that stable foundation, open source allows developers to spend their time on truly innovative approaches, rather than plumbing.”
Also, the tools developers use to manage distributed computing environments are increasingly based on open source, including the Eucalyptus project, the Deltacloud project, libvirt and, of course, Hadoop.
“Open source is especially useful in this space, not just because the tools are freely available, but because there are so few widely adopted standards,” Hellekson said. “In lieu of standards, open source represents ‘rough consensus and running code,’ which is a flexible and useful way of quickly arriving at consensus standards. This is especially useful in a space that’s moving as quickly as cloud computing.”
But Hellekson said the most important contribution of open source in cloud computing isn’t a low barrier to entry, or a great venue for consensus.
“Open source is a driver of innovation,” he said. “Successful open source projects represent the best thinking available by drawing on a large number of very clever developers. Everyone works together to produce the best solution possible. I think one of the reasons cloud computing is moving so quickly is that the underlying open source projects are moving at a tremendous pace — one that any single proprietary offering would be hard-pressed to match.”
Kevin Jackson, an engineering fellow at NJVC, agrees. “Most of our customers are really data intensive,” he said. “Traditionally, applications that do this have been walled-off infrastructures with walled-off information that the DoD and the intelligence community use to do their work. That was good as long as the DoD drove technology, but in recent years you can see there has been a tremendous shift.”
He said the technology underpinning this shift is “the ability to virtualize your compute capability and connect to multiple servers … because now you can make infrastructure decisions based upon your operational requirements.”
Security in the cloud must change as well, he said, from an infrastructure-centric model to a data-centric one.
“Traditionally, security has been about building up walls and building around your infrastructure, and then watching and inspecting everything that goes in and out of that infrastructure,” Jackson said. “The power of the cloud is that the infrastructure is virtual. So, you can’t really build those walls. Really the only thing you can do is manage the data and use security around the data.”
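One concrete reading of “security around the data” is to encrypt the payload itself, so protection travels with the data wherever the virtual infrastructure places it. Below is a minimal sketch using the third-party cryptography package for Python; it illustrates the data-centric idea generally, not anything Jackson or NJVC specifically described.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# The key stays with the data owner; only ciphertext leaves for the cloud.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"patient-id=1234; status=..."  # hypothetical sensitive payload
token = fernet.encrypt(record)           # safe to store on untrusted nodes

# Wherever the ciphertext moves, only a key holder can recover the plaintext.
assert fernet.decrypt(token) == record
```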
Major technological revolutions in cloud computing will make or break the promise of this rapidly evolving field. Whether it’s data-centric security or large-scale open-source software projects tackling access and latency problems, the solutions that emerge in the next five years will shape the next generation of computing.