Survey Says…
Earlier this week, Platform and the independent analyst firm Taneja Group released the results of a survey we conducted over the summer. It examined how senior IT managers across a variety of industries are handling the challenges in their Test/Dev environments, and where server virtualization and cloud computing solutions can address those problems.
Guess what the survey found…
The top challenge facing Test/Dev managers today? Having to manage their virtual and physical resources separately. Respondents also said that virtualization on its own is not addressing their most important infrastructure challenges. Gartner reiterated this point last week, as my colleague Nick Werstiuk noted in his post (http://platformcomputing.blogspot.com/2009/10/clouds-are-everywherebut-where-do-i.html) reporting from Gartner Symposium/ITxpo.
Enterprises are increasingly relying on shared infrastructures: in our survey, 92 percent of Test/Dev operations already use shared infrastructures to address operational or budgetary challenges within their departments. But many Test/Dev applications run on both physical and virtual machines, so IT has to allocate shared workloads not only among departments but also across different types of machines. Herein lies the problem!
Although virtualization helps these teams make better use of their environments, it often adds layers of control and cost issues that must be addressed before the capabilities of all machines, physical or virtual, can be fully integrated. The virtual layer doesn’t include the sharing, process, workflow, or other management capabilities the departments need.
All of this matters because many companies’ first forays into private cloud computing are happening in their Test/Dev departments. And all of these elements (sharing, process, workflow, management and, yes, virtualization) are crucial to building private clouds.
I’ll leave you with two other proof points from the survey. The use of private clouds is indeed underway in many Test/Dev environments. Despite all the industry hoopla and denials, companies are actually already using this stuff, and not only for Test/Dev but also for their production environments. Seventy-six percent of our survey respondents are already using shared infrastructures or private clouds for both Test/Dev and production applications, such as CRM, finance and web applications. They’re also doing this primarily within their own firewalls, because they are wary of the limited control and immature technology that public clouds currently offer. As a result, 82 percent of respondents said they don’t use hosted solutions, only private ones.
The clouds are coming, and they’re coming fast—is your enterprise ready?
For more information on the press release and the survey, please visit:
http://searchcloudcomputing.bitpipe.com/detail/RES/1256567423_786.html
www.platform.com/eforums/eforum.asp?1-1K3AU7
Clouds are Everywhere…But Where Do I Start?
I just returned from four very interesting and insightful days at Gartner Symposium/ITxpo in Orlando.
It was remarkable to see how quickly the phrase “cloud computing” has entered the mainstream IT vocabulary. Clouds were everywhere at the symposium: public, private, hybrid, and many other types were presented, discussed and dissected. For a sense of cloud’s velocity, I re-checked the 2008 Symposium agenda and found only a few sessions on cloud and application development, nowhere near the quantity and quality of cloud content at this year’s event.
There has been plenty of coverage already of Gartner’s proclamation that cloud is the #1 strategic technology for IT organizations to look at in 2010, so I won’t rehash that theme here. Instead, I’ll offer some perspective as an attendee who sat in on all the key cloud sessions and participated in six separate one-on-one sessions this week with leading Gartner analysts who look at cloud from an infrastructure and IT operations perspective. (I also won’t start a debate right now on whether cloud is a technology, a style of computing, or an IT business model. We’ll save that for a later posting!)
From the four-plus days of conference content, I’ll distill what stuck with me to a few key points:
- Cloud is REAL for enterprise IT, and private cloud in particular is key. A couple of data points beyond the hype: in one session, “Server Virtualization: Emerging to Mainstream at Lightspeed” by Tom Bittman, an informal poll of the 300 or so attendees asked, “Who in the audience is looking at building a private cloud?” I estimate at least 80-90% of the audience raised their hands (which I think surprised a lot of people). In a second session, Cameron Haight offered a strategic planning assumption that enterprises will spend more on private than on public cloud through 2012.
- Bottom line: the private cloud market will develop very quickly as momentum builds, and enterprise organizations need to start thinking about private cloud now, at both an architectural and an experimental level.
- At the architectural level, your private cloud is the property of the enterprise, not the property of a single infrastructure or application vendor (as some vendors may lead you to believe). It is critical that organizations think strategically about their cloud architecture so they retain control and keep the flexibility to plug the best technologies and solutions into their cloud strategy as the market unfolds. Protect yourself from building vendor-specific clouds.
- At the experimental level, cloud does not equal virtualization (a point made many times during the week). As enterprises investigate cloud, it is essential to initiate a cloud project (not just virtualization, not just automation) to explore the technologies and solutions that support their strategic thinking around cloud architecture. You can get started now, but make sure your experiments align with the overall architectural strategy.
Overall, the conference was very worthwhile, and I would recommend attendance for enterprise architects and CIOs who are wrestling with key issues in moving their IT business forward. I look forward to next year’s conference to see how all the cloud hype has been turned into reality for large enterprise IT departments.
Private Clouds, Application Awareness and Why Virtualization is Not Enough…
Given the huge amount of conflicting opinion regarding the future and meaning of cloud, it was refreshing this week to take part in a panel discussion in which some degree of consensus was reached. Speaking at Waters Power in London, an event that explores IT challenges facing the financial services sector, I was joined by senior IT executives from tier-one banks to discuss the costs, risks and benefits of cloud.
Participants on the panel said they were excited about cloud and were actively looking at ways it could benefit their organizations. This in itself was significant, since most of the talk at last year’s event focused on grid and virtualization rather than cloud. More interesting, however, was the general agreement on where cloud is heading: the panel believed that while public cloud services are no doubt the future of IT, we’re some way off from reaching that point.
For now, private cloud is where firms are focusing most of their attention, particularly because organizations remain cautious about the security of their data when it is stored beyond the corporate network. Latency, too, was felt to be a key barrier to public cloud adoption, and it was agreed that more needs to be done to ensure applications are suitable for the cloud.
The panel also raised the issue of economics. Despite the widely touted cost-saving benefits of cloud services, organizations are reluctant to pay others to host their data when they have already invested significant sums in their own IT infrastructure, which everyone agreed is greatly underutilized. Getting the most out of existing investments is therefore crucial, and a key challenge. That is why we developed Platform ISF: to bring the rest of the data center the same cost-saving and utilization benefits Platform Symphony has delivered for HPC workloads. ISF is currently in beta with many of these same customers and will be generally available in November.
Beyond the fact that private cloud is the main focus right now, the other two key takeaways from the panel conversation, and from several senior IT executives I met with during my visits to Paris and London this past week, were “application awareness” in the cloud and the idea that “virtualization is not enough.”
On the first topic, it was clear from the audience questions that attendees were concerned about where applications and their sub-components would be placed when deployed as VMs onto this dynamic infrastructure. They wanted the cloud “engine” to perform automated placement that considers some type of “affinity”, that is, how close a deployment lands relative to other application components, storage, networks, and so on. This is something Platform ISF already includes in its policy-based placement engine, as noted in the ISF data sheet, so I guess we are on the right track.
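To make the idea concrete, here is a minimal sketch of what affinity-aware placement might look like, assuming a simple host-scoring model. The classes, weights and rules below are hypothetical illustrations for this post, not Platform ISF’s actual engine:

```python
# Illustrative sketch of affinity-aware VM placement (hypothetical model,
# not Platform ISF's engine; all names and weights are made up).
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    rack: str                  # proxy for "closeness" to storage/network
    free_cpus: int
    free_mem_gb: int
    vms: set = field(default_factory=set)   # VMs already placed here

@dataclass
class VMRequest:
    name: str
    cpus: int
    mem_gb: int
    affinity: set              # VMs this one should sit near (e.g., its DB tier)
    preferred_rack: str = ""   # e.g., the rack closest to the app's storage

def score(host: Host, vm: VMRequest) -> float:
    """Higher is better; negative means the host cannot take the VM."""
    if host.free_cpus < vm.cpus or host.free_mem_gb < vm.mem_gb:
        return -1.0
    s = 10.0 * len(vm.affinity & host.vms)   # reward co-location with peers
    if vm.preferred_rack and host.rack == vm.preferred_rack:
        s += 5.0                             # reward closeness to storage/network
    s += 0.1 * host.free_cpus                # mild preference for headroom
    return s

def place(vm: VMRequest, hosts: list) -> Host:
    best = max(hosts, key=lambda h: score(h, vm))
    if score(best, vm) < 0:
        raise RuntimeError(f"no host can fit {vm.name}")
    best.free_cpus -= vm.cpus
    best.free_mem_gb -= vm.mem_gb
    best.vms.add(vm.name)
    return best

# Example: an app server gravitates to the host already running its database.
hosts = [Host("h1", "rack-a", 8, 32), Host("h2", "rack-b", 16, 64, vms={"db1"})]
chosen = place(VMRequest("app1", 4, 8, affinity={"db1"}), hosts)
print(chosen.name)   # h2: it already hosts db1, the affinity target
```

The point of the sketch is simply that the placement decision weighs application topology (who should sit near whom) alongside raw capacity, which is what the audience meant by an application-aware engine.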
As for virtualization not being enough, the issue is that many firms do “static placement” of VMs, which only helps slightly in increasing infrastructure utilization and lowering costs. Because of static placement, application VMs are over-provisioned, which means fewer VMs can be “packed” into one server than originally expected. VMs are placed statically, and generally not moved unless one becomes “hot” and needs more CPU than is available, because migration normally requires a system administrator to do it manually. Few companies trust their VM management software to migrate VMs automatically, since that software lacks the application awareness and intelligence discussed earlier in this post.
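A quick back-of-the-envelope illustration of why sizing for peak demand hurts packing density; the numbers here are hypothetical, not figures from the panel or the survey:

```python
# Hypothetical numbers: static over-provisioning vs. dynamic placement.
host_cpus = 32

# Static placement: each VM is sized for its PEAK demand, and that capacity
# is reserved permanently, whether or not the application is busy.
peak_cpus_per_vm = 8
static_density = host_cpus // peak_cpus_per_vm   # 4 VMs per host

# Dynamic, application-aware placement: size VMs for their AVERAGE demand
# and rely on the engine to migrate a VM when it runs hot.
avg_cpus_per_vm = 2
dynamic_density = host_cpus // avg_cpus_per_vm   # 16 VMs per host

print(f"static: {static_density} VMs/host, dynamic: {dynamic_density} VMs/host")
```

The gap between the two densities is exactly the capacity a manual-migration shop leaves idle as insurance against hot spots.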
Expect to see more financial services firms taking the step beyond grid in the months ahead, as the efficiency and cost-saving benefits of private cloud become too tempting to ignore. As one participant commented, even in a downturn banks need to innovate. Experimenting with private cloud and establishing a flexible, scalable IT infrastructure can support innovation by helping firms get products to market faster than ever. On this point there was little disagreement: that is hugely desirable in today’s economic climate.
Welcome to our new corporate blog
My role is to help engage customers and partners in defining what cloud computing means: what it means to each individual organization, the technology industry, IT architecture and models, the marketplace, and the bottom line. With that in mind, I’ll try to articulate a few of Platform’s points of view on what’s happening to get the dialog started.
Clusters, grids, clouds, whatever. I’ve been in the industry long enough to have seen many cycles and the reality that follows them, good and bad. As always, change is sparked by a revolution and becomes mainstream through evolution. Amazon and Google have put a bright light on the deficiencies of current IT models through their so-called “public” cloud computing offerings. Similarly, the emergence of “private clouds” as a term reflects the immaturity of those services. The point is that cloud computing is real, despite its hype, and will borrow heavily from today’s cluster and grid environments. Like all trends and terms, cloud will eventually give way to “whatever” comes next and leave in its wake the fundamental technical and business challenges that remain constant.
Let’s get beyond the hype. At Platform, our goal is to discuss the “whatever” that is your unique and heterogeneous environment, and “whatever” you envision as its future state. Specifically: what are the application workload, resource pool and service-level management lessons that cloud computing will teach us again?