6 Ways Cloud Computing Will Evolve In 2012
By Charles Babcock
Cloud computing has become so broadly accepted that it won't rank as an exciting development for 2012. Instead, you will see a more organized, concerted application of resources to further the cloud's use in conjunction with central IT. Let's take a look at the top things we can expect from the cloud over the next year.
1. 2012: Year Of The Hybrid Cloud
The most obvious expression of the hybrid trend is the serious interest in private cloud computing, in which more of the enterprise data center is given over to virtualized and automated operations, including end-user self-service. Why? Because the public cloud, even if it's still not fully trusted, is understood to be a long-term player on the landscape. The movement toward internal cloud computing isn't in opposition to the public cloud. Rather, it reflects the growing sense within IT that its own environment will need to be as efficient and compatible as possible.
Interest in the EC2-compatible Eucalyptus Systems platform, the general-purpose Nimbula Director cloud operating system, and the OpenStack initiative is a sign of serious private cloud planning and implementation. VMware's cloud initiatives would be faltering if virtualization stopped at the edge of the virtualized server, but it doesn't: it extends out into storage, I/O, and networking. Managing these resources as virtualized pools is a giant step toward internal cloud computing. Dell's support for VMware's cloud software, and VMware's ability to attract programmers to its Cloud Foundry platform, both speak to interest in, and use of, the future private cloud.
[ VMware launched several bold initiatives in 2011. See VMware's Best And Worst Moves Of 2011. ]
2. Development Moves In
Speaking of Cloud Foundry, the unusual open source initiative (unusual for VMware, a strongly proprietary company) launched last April has borne unusual fruit. There's a growing understanding that applications in the cloud will be different, and that agile development will never quite get to DevOps unless development for the cloud moves into the cloud. Both of these realizations were behind Cloud Foundry being named the best overall developer platform in a recent Evans Data survey of programmers. It had to beat out Microsoft Azure and IBM's Smart Cloud, both well provisioned with developer tools, as well as Google App Engine with its Gadgets.
Why did it win? Well, for one thing, I think the Evans Data surveys appeal to independent programmers, the ones who are less frequently users of IBM Rational or Microsoft Visual Studio tools (although there are plenty of enterprise programmers using Cloud Foundry). In addition, VMware is scrupulously cultivating an open atmosphere where all are welcome. Cloud Foundry is a staging ground for Spring Framework projects by Java developers. But in mid-December, Tier 3 and Uhuru Software contributed .NET Framework support. The Foundry itself is written in Ruby and will also support dynamic languages such as PHP and Python. It's becoming one of the few broadly supportive development platforms where many programming groups might find a home. As it does, more development moves into the cloud.
3. Finally, The Virtualized Client
Bigger than development, however, is the head of steam building up behind virtualized clients. So far, the success story has been virtualized servers, with the unwashed masses of confused clients lagging far behind. In 2012, that's about to change. Didn't we say that last year? This time, it's real. Big advances are being made in keeping virtualized clients secure, in some cases led by Citrix Systems. If virtual desktops are more secure than physical ones, a major cost justification for the move materializes. A virtual user interface that can move from device to device resolves some of the conflicts blocking the bring-your-own-computer-to-work transition. Look for a major ATM supplier, Diebold, to describe how it's using virtual end-user transactions to secure its ATM networks. Personal data can't be stolen (the way it was at supermarkets) if it's not resident on the endpoint transaction machine. If it works on ATMs, it may work on your end-user clients.
4. Cloud Security ... In Depth
In a conversation, Capgemini CTO Joe Coyle made an interesting prediction: "An accepted security model will come to the cloud in 2012." Amazon started 2011 with a new PCI compliance rating that said secure credit card transactions could be executed in EC2. During the year, cloud provider Terremark's NAP of the Americas data center in Miami, Fla., passed the Department of Defense's Information Assurance Certification and Accreditation Process, while Harris implemented its Cyber Integration Center for healthcare data processing, with market-leading defense in depth. Security can be achieved in the cloud as well as in the enterprise data center. "It's a matter of both the clients and the vendors understanding who has responsibility for which pieces," said Coyle. And in 2012, a blueprint for how that's achieved will be laid out.
5. Green Eye Shades Or Shades Of Green?
Builders of new data centers, including Facebook, Google, Amazon, and Microsoft, boast about the new levels of energy efficiency they've achieved. With ice caps melting, we are not far off from the day when data center electricity consumption is monitored with the same intense scrutiny as home consumption, and energy-efficient clouds will appeal to both IT and consumers. Not all applications need 300-watt, high-capacity, high-speed servers behind them. In some cases, 15 or 20 watts will do. The intense use of mobile devices could in many instances be served by leading-edge, energy-efficient data centers, perhaps populated with servers like the ones HP announced using Calxeda's ARM chip. Maybe the Cortex-based servers, using 89% less power than conventional ones, wouldn't be the best platform for streaming video of the ballgame you want to see. But they'd be fine for the occasional burst of data telling you what the score is. And they'd keep you out of the red zone of too much energy consumption.
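A quick back-of-the-envelope sketch shows why those wattage figures matter at data-center scale. The numbers below simply reuse the rough figures cited above (a 300-watt conventional server, 89% less power for the ARM alternative); the fleet size is an illustrative assumption, not a reported one:

```python
# Back-of-the-envelope annual energy comparison between a conventional
# ~300 W server and an ARM-based server drawing ~89% less power.
# All inputs are illustrative, taken from the rough figures in the article.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours of continuous operation

def annual_kwh(watts, hours=HOURS_PER_YEAR):
    """Energy in kilowatt-hours for a server drawing `watts` continuously."""
    return watts * hours / 1000.0

conventional_w = 300.0
arm_w = conventional_w * (1 - 0.89)  # 89% less power -> 33 W

saved_per_server = annual_kwh(conventional_w) - annual_kwh(arm_w)

print(f"ARM server draw: {arm_w:.0f} W")                              # 33 W
print(f"Annual savings per server: {saved_per_server:.0f} kWh")       # ~2,339 kWh
print(f"Across 10,000 servers: {saved_per_server * 10_000 / 1e6:.1f} GWh")
```

Even with these crude assumptions (constant draw, no cooling overhead, no utilization differences), the gap compounds to gigawatt-hours per year across a modest fleet, which is the economic argument behind low-power server designs.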
6. Going Over To The Dark Side
In 2012, we will see the first incident in which a hacker gets inside a public cloud and produces mischief and mayhem. The intruder will seem to understand its infrastructure and its protective measures, and for a time will defy expulsion. CIO Jerry Johnson's account of how a hacker (unidentified, but possibly a Chinese intruder) got into Pacific Northwest National Lab is too compelling a story to let me believe the public cloud's defenses would fare much better. At this stage of cloud computing, too many doors are constantly opening and closing to the public cloud for defenses to cover all eventualities. Defense in depth is needed and is coming, but it is not quite sufficiently in place to prevent an incident.
Charles Babcock is an editor-at-large for InformationWeek.