Financial Services Market Going Strong & Platform Symphony 5.1 Launched

I am happy to report that over the last two quarters, Platform has won several significant multi-million-dollar deals in both the capital markets and insurance sectors, and has expanded its portfolio of Fortune 500 customers. These landmark sales indicate growing demand for Platform’s products, including our cloud management technology, Platform ISF, and Platform Symphony, whose latest version we are announcing today. In fact, Platform now counts many of Wall Street’s largest banks and trading firms among its flagship customers, including 12 of the Bloomberg 20. We also had two significant customer wins in Bloomberg’s top 10 listing in the past fiscal year, which means Platform can now count the majority (six) of the top 10 banks and financial institutions among its customer base!

Today’s release of Platform Symphony 5.1 comes at an important time, marked by increased economic and regulatory challenges that are driving organizations to reevaluate their approaches to risk management and compliance. The growing complexity of products and portfolios in both the financial services and enterprise sectors, together with the sheer amount of complex data organizations need to process, has created demand for powerful, flexible systems that provide accurate, real-time counter-party risk assessments. Symphony is an enterprise-class solution that improves efficiency for distributed, service-oriented architecture (SOA) applications, enabling a variety of business and financial risk applications, including real-time pricing, value-at-risk and Monte Carlo simulations, all at a lower cost. Platform Symphony also provides distributed computing workload solutions for analytics processing, such as Complex Event Processing (CEP), Extract, Transform and Load (ETL) and Business Intelligence.
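
To make the kind of risk workload mentioned above a little more concrete, here is a minimal, illustrative Python sketch of a Monte Carlo value-at-risk calculation. The portfolio figures, parameters and function name are invented for illustration only; they are not taken from Platform Symphony, and a production risk engine would price each instrument and distribute these scenarios across a grid rather than run them in one process.

```python
import numpy as np

def monte_carlo_var(portfolio_value, mu, sigma, horizon_days,
                    n_paths, confidence=0.99, seed=42):
    """Estimate value-at-risk by simulating portfolio returns.

    Each path draws a horizon return from a normal distribution;
    VaR is the loss not exceeded at the given confidence level.
    (Illustrative assumptions only -- real engines model individual
    positions and counter-party exposure explicitly.)
    """
    rng = np.random.default_rng(seed)
    # Drift plus diffusion, scaled to the horizon, one draw per scenario.
    returns = rng.normal(mu * horizon_days,
                         sigma * np.sqrt(horizon_days), n_paths)
    pnl = portfolio_value * returns          # profit/loss per scenario
    return -np.percentile(pnl, (1 - confidence) * 100)

if __name__ == "__main__":
    # Hypothetical book: $100M portfolio, small daily drift, 2% daily vol.
    var = monte_carlo_var(100e6, 0.0002, 0.02, horizon_days=1, n_paths=1_000_000)
    print(f"1-day 99% VaR: ${var:,.0f}")
```

Because every scenario is independent, this is exactly the sort of embarrassingly parallel work that a grid scheduler can fan out across thousands of cores.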

Platform Symphony 5.1, now with easier application onboarding, GPU support and increased scalability, will be generally available at the end of this month. To read more about its updated features and Platform’s strong showing in financial services sales, please check out our press room for the official announcement.

HPC from A-Z (part 11) - K

K is for knowledge mining

As the saying goes, knowledge is power – and who am I to argue?

The more knowledge we have of a subject, situation or person, the more informed the decision-making process becomes. For example, if I were looking to purchase a rental property in New York, it would be useful to know which areas were popular and what prices I could reasonably ask. Similarly, by checking the dress code on an invitation I can avoid the potentially embarrassing situation of turning up to a cocktail party in a Halloween costume.

For businesses, the stakes are somewhat raised. A business with knowledge of its customer base is in a much stronger position than one which has no idea who is buying its products or services. Having this type of knowledge provides a competitive advantage for companies looking to understand behaviours and trends, and for large companies it could be worth millions or even billions of dollars in profit or savings.

This isn’t limited to business either; educational institutions such as the University of Oklahoma are using compute-intensive tools that can turn raw data into superior knowledge, allowing them to make informed decisions about research priorities.

Similarly, if you think back to the post we did on the letter E, Renewable Energy Systems was able to find the optimum location to build wind farms based on their knowledge of regional wind patterns.

However, mining business data can be a time-consuming task – and this is where HPC comes in. In order to make the best business decisions possible, companies need to be able to mine and analyse data in real-time: at the speed of business itself.

HPC lends itself well to handling vast volumes of data because it allows for rapid analysis in bulk. The large volumes of information required to provide this competitive edge can often swamp conventional data centers, but an HPC solution provides the support necessary to access, process and analyze numerous data types quickly and efficiently, even at very large scale.
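
As a toy illustration of the divide-and-analyse pattern described above, the sketch below splits a transaction log into chunks and mines them in parallel across worker processes. The data and function names are invented, and a real HPC cluster would schedule this work across many machines rather than the cores of a single one, but the shape of the job is the same.

```python
from concurrent.futures import ProcessPoolExecutor
from collections import Counter

def mine_chunk(records):
    """Toy 'knowledge mining' step: count purchases per product in one chunk."""
    counts = Counter()
    for customer_id, product in records:
        counts[product] += 1
    return counts

def mine_in_parallel(chunks, workers=8):
    """Fan chunks out to worker processes and merge the partial results."""
    total = Counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(mine_chunk, chunks):
            total.update(partial)
    return total

if __name__ == "__main__":
    # Invented sample data: in practice each chunk would be a slice of a
    # far larger transaction log spread across the cluster.
    data = [[("c1", "laptop"), ("c2", "phone")],
            [("c3", "phone"), ("c4", "phone")]]
    print(mine_in_parallel(data).most_common(1))   # -> [('phone', 3)]
```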

There are no prizes for coming second in business, so it makes sense to be in the know.

HPC from A-Z (part 10) - J

J is for jet engine design.

It’s difficult today to imagine a world without air travel. Whether travelling for business or pleasure, the human species has well and truly taken to the skies – and we owe it all to the jet engine.

We’ve come a long way since 1939, when the Heinkel He 178 became the world's first jet-powered aircraft. Back then, jet engines were far less efficient than their piston equivalents and many thought they wouldn’t amount to much. Well, those early naysayers couldn’t have been more wrong. The 21st-century jet engine is a true masterpiece of design and physics!

However, with stringent safety procedures to follow and the unthinkable cost of producing a faulty model, engine design and testing is a hugely expensive business. A designer of engines since 1925, Pratt & Whitney is ahead of the curve when it comes to innovation. The company relies on computer-aided simulation to reduce the enormous price tag attached to physical engine testing, while ensuring that high standards of testing are maintained throughout.

In order to power such complex simulations, Pratt & Whitney depends on (yes, you guessed it) a high performance computing solution, allowing it to explore design alternatives easily without relying on costly physical testing. The results? Greatly reduced production expense, faster time to market and shorter flight times, thanks to bigger, faster engines!

So, the next time you’re sunning yourself next to the pool, sipping cocktails, remember the part that HPC has played in getting you there!

HPC from A-Z (part 9) - I

I is for insurance


Like all financial services companies today, insurance companies (whether national or global in scale) are under constant pressure. The rise of consumer empowerment through the growth of online comparison sites means insurance companies’ websites need to manage increased volumes of traffic, requiring additional processing power. Meanwhile, the reporting requirements of the regulatory environment are unrelenting. And all of this comes against a backdrop of consumer uncertainty about the economic future and an incredibly competitive industry.


Recently I was impressed by the bold action a major international insurance customer, specializing in retail services, took to plan ahead for these kinds of challenges. The company wanted to support strategic decision-making around the complex issue of the ‘inherited estate’ by scaling out from a small, overloaded cluster to hundreds of machines. It also needed to manage the workload brought on by the strict UK regulatory environment; PricewaterhouseCoopers had explained, dauntingly, that it would need a ‘weapon of mass computation’ to achieve this.


HPC gave the company the ability to run up to 10 times more scenario models at once, providing it with the information needed to make a strategic decision not to reallocate inherited estate money. The broader strategic value gained, though, is something the wider insurance industry would benefit from considering.


While this customer’s approach is not unique, there are still many companies in the industry that have not implemented tools that allow them to improve the performance of their compute applications and hardware. And beyond this, there’s even more value to be derived from existing HPC infrastructure: for example, extending modelling to Individual Capital Assessments. These tools help firms identify major sources of risk within their business (credit, market, liquidity, operational and insurance risks) and perform appropriate stress scenario testing for each of the identified risks. Surely this is hugely desirable, if not essential, for the industry at large?

HPC from A-Z (part 8) - H

This blog post will focus on the letter H, and how HPC is being used to crunch data to determine the origins of the universe!


High Energy Physics – When it comes to high energy physics, there is only one name in the frame: CERN. The European Organization for Nuclear Research is using HPC for the Large Hadron Collider, a 27-kilometer ring of superconducting magnets buried roughly 100 meters underground and designed to improve our understanding of atoms and the universe. This is quite possibly the most fascinating and imaginative use of HPC. Can you think of one that trumps it?


CERN depends on computing power to ensure that more than 17,000 scientists and researchers in 270 research centers across 48 countries can collaborate on a global scale to solve the mysteries of matter and the universe. An HPC environment is central to this, enabling scientists and researchers to analyze the data quickly.