SIBOS off and running

With SIBOS kicking into high gear this week, it’s hard not to think about the significant changes that have gone on in the banking industry over the past few years. The most obvious reflection of this lies in the key themes of the conference. The idea of ‘rebuilding trust’ addresses how banks can tackle the conflict between reducing risk and bringing down cost while ‘recovery’ looks at leveraging technology to innovate and capitalize on the improving climate. ‘Regulation’ also tops the agenda and will no doubt remain a C-level priority for many years to come.

Tying all of these themes together is the role of technology in helping organizations overcome the obstacles they face. After all, data proliferation, and the need to better manage it, is at the root of many of the challenges financial institutions face today. For example, our recent survey found that 66% of buy-side firms and 56% of sell-side firms are grappling with siloed data sources.

Looking at some of our recent work, it is interesting to see how financial services firms are tackling these areas. Svenska Handelsbanken is using high-performance computing to run core applications for trading and risk management. New regulations around risk in Sweden mean more risk scenarios need to be run, and additional compute power allows the bank to plan ahead much more easily. Other financial institutions, including Citigroup, are running similar projects.

As the event continues through the week, it will be interesting to see the different ways financial organizations are tackling these challenges. Hopefully, we will finally begin to see an end to the stormy crisis clouds and a sunny future shining through.

By Jeff Hong, CFA
Director of Financial Services Industry Marketing
Platform Computing

Calling Dr. Feelgood Again!

On the first day of SIBOS here in cold Amsterdam, I attended the market structures keynote and panel session chaired by Chris Skinner, CEO of Balatro. The session opened amicably, with Alan Cameron of BNP Paribas exploring the impact of the Code of Conduct and the T2S initiative on the pan-European Settlement and Clearing landscape. All in all, he concluded that the markets are siloed and need to change: either the markets consolidate into one global entity, or they interoperate with one another. His analogy for the prescription for change was that of a doctor presenting three methods for weight loss to a patient.

• Option 1: Go on a diet (harmonization via baby steps).
• Option 2: Exercise (harmonization via interoperability).
• Option 3: Liposuction (harmonization by regulation, e.g. T2S).

He concluded that all three were needed.

Before you ask, "Where is this going?" the above eerily describes the siloed infrastructures found in many large FS firms. Despite banks' greater focus on efficiency, barriers to consolidation remain for many reasons, such as interoperability and regulation. As we have seen here at Platform Computing, when firms finally move to a shared pool of resources, they gain economies of scale and IT agility. And as in the Clearing and Settlement space, the way to a shared-service operation will likely be a combination of diet, exercise, and nip-tuck. More on that tomorrow.

By Jeff Hong, CFA
Director of Financial Services Industry Marketing
Platform Computing

Innovation in the cloud

During his recent speech at the London School of Economics, Steve Ballmer said, "The cloud will open a whole new range of opportunities to use computing in more valuable ways. There are new applications that you couldn't have built in a world without the cloud."

Cloud really is sparking innovation, and the promise of innovation in the cloud is a driver for adoption. This year we conducted a study on the uptake of cloud technology and found that while improving efficiency was the main motivator in 2009 (41%), the 2010 survey revealed that experimenting with cloud (19%) had itself become a driver for deploying private cloud.

Private cloud is a good home for test and development because it offers a flexible, cost-effective and safe environment to operate in. We recently wrote about a test and development cloud; this type of private cloud deployment provides a self-service test and development infrastructure.

Not only does cloud offer a platform for innovation, but, as Ballmer states, it also frees up resources, giving executives time to innovate.

To make the most of the cloud, it's essential to approach a migration carefully. A phrase we like to use is "a cloud is built, not bought", and as with all good construction, comprehensive plans, firm foundations and the right materials are the essential building blocks.

Private clouds are likely to continue to outpace public cloud models and I expect the increase in private cloud adoption to be matched by an increase in innovation.

Paging “Dr. Feelgood”

Hello, I'm Jeff Hong, Director of Financial Services Industry Marketing at Platform Computing. Recently, we and our partner SAS sponsored a survey with Wall Street and Technology (WS&T) of leading capital markets firms on analytics (http://www.grid-analytics.wallstreetandtech.com/index.jhtml). While many of the survey's results were par for the course, a few findings raised collective eyebrows here at the company.
The number one "raise my brow" finding: "A significant 36 percent of buy-side firms and 30 percent of sell-side firms run counterparty risk analytics only on an ad hoc basis, or not at all." Uh huh...

The number two "raise my brow" finding: "Two-thirds of respondents were not confident that their current analytic platform/infrastructure will continue to keep up over time." Uh oh!

But won't Dodd-Frank and its ilk - designed to minimize the risk taken by banks - by implication require more risk analytics? Well, yes and no. There appear to be multiple opinions, some of which include:

• Regulations will dampen enthusiasm for OTC derivatives, favoring exchange traded assets. In this case, risk analytics become simpler.

• Investment banks will divest or shift their prop trading desks to other divisions, e.g. asset management. In this case, it's no longer their concern.

• Regulations like Basel III will engender more risk, as banks need to get same return with less capital in play. More risk analytics are needed.

• Compliance will take years, so regulators and banks have time to make adjustments and plans. More or less risk analytics? Unclear.

Add in the debate on flash trading, and the future of trading, and therefore of risk analytics, becomes unclear. What is clear is that if your firm analyzes counterparty risk on an ad hoc basis or not at all, that is a real problem.

Accurately calculating your risk exposure - be it counterparty risk, liquidity risk, or otherwise - and therefore your firm's health is always a good idea. Good risk practices will set your firm above the current and future debate on risk and regulation. Investing in your analytics infrastructure - be it new servers, better resource management with grid, or smarter data management - should be a priority. According to the same WS&T survey, financial institutions are looking to cluster, grid, and cloud software technologies: within the next two years, 51 percent of respondents are considering or likely to invest in cluster technology, 53 percent in grid technology, and 57 percent in cloud technology.

If your firm is planning to or is already doing more analysis, good for you! Otherwise take a couple of those blue or pink pills, and hope the headache goes away.

Long live CERN…

September marks the second birthday of CERN's Large Hadron Collider. That makes it two years since the machine was switched on to start uncovering the mysteries of the universe.

For us at Platform Computing, this anniversary is particularly exciting, as it also serves as a reminder of the work we are doing to help CERN reach its goals. After all, answering life's fundamental questions would be impossible without the technology to support it. No human has the capacity to see a particle, let alone weigh it, without some substantial help from the world of IT.

In fact, our work with CERN stretches back well beyond the Large Hadron Collider. We first started working with the team in the early 1990s, when they implemented Platform LSF. Scalability and adaptability are key to its success, as new projects are launched all the time and no organization has the resources to bring in new systems every few months to accommodate them.

Over the past two decades, we've continued to work with CERN and address their changing demands. We've helped develop and implement software to support more jobs and more nodes as the size of its cluster has increased. So, when significant advances like the Large Hadron Collider are announced and their success demonstrated, we like to feel we've played a small part.

If you are a Scientific Computing World subscriber, see page 42 of the October/November 2010 issue, in which Helge Meinhard, group leader of platform and engineering services in CERN's IT Department, talks about overcoming problems faced by just about any HPC center in the world - space, power, cooling, and budget - and how Platform ISF and LSF are helping to meet those challenges.

The Business of IT?

Two sets of recent discussions highlighted the business role of private clouds for me. The first set of conversations was at the Wall Street and Technology roundtable in NYC September 20th. Leading financial institutions gathered to discuss the role and challenges of big analytics. The second set of conversations was with our Platform ISF customers who are building private clouds.

My takeaway from the Wall Street and Technology event was that IT silos dominate in areas, or periods, of high growth. In a fast-growing business, cost containment is a lower priority, and business owners and units have more leeway to build their own systems. To paraphrase one comment: "When our stock is at $100, centralization and consolidation are not big issues; when our stock is at $10, sharing and metering come to the forefront."

My takeaway from the customer discussions is that virtualization is an operational issue, while cloud is a business-model issue. For organizations that have concluded that the place to start with cloud is private cloud, the choice of a vendor/partner becomes strategic, because the business-model implications of cost and control are the strategic reason to consider cloud in the first place.

In our blog post "a strategy straight out of Redmond", we discussed the agendas of each vendor camp. The point is that several of those camps would prefer the decision for private cloud to be an operational one rather than a strategic one. For many customers, private cloud may indeed be more of an operational decision, but many are seeing cloud for what it is: the chance for a new IT order with the customer in control. A year ago, a long-time industry analyst told me, "the difference between grid and cloud is that grid was driven by the academics and IT, whereas cloud is being driven by the business." This applies to virtualization as well.

So, if we're in a period of centralization and consolidation for many IT groups, and the private cloud management layer is strategic for many firms, then the choice of your vendor camp becomes as important as, or more important than, the particular vendor you choose. In other words, the business model is likely to drive the architectural model, rather than the other way around.