Challenge #4: Lack of Quality of Service
We are back after a short break. Our series on the challenges with the current implementation of Hadoop MapReduce continues. In this blog post, let's take a look at the fourth challenge in the existing Hadoop stack: the lack of quality of service.
By quality of service, we are referring to the capability to dynamically allocate available IT infrastructure based on workload requirements, maximizing resource utilization and preventing silos. These capabilities lead to better application performance and faster time to results, and therefore a higher return on investment for the IT organization. The architecture of the existing open source Hadoop stack limits these capabilities.

As mentioned in part 2 of this blog series, the single job tracker in the current Hadoop implementation is not separated from the resource manager; as a result, the job tracker does not provide sufficient resource management functionality to allow dynamic lending and borrowing of available IT resources. This creates a static IT environment in which each Hadoop application can only run on a pre-assigned set of resources at a given time, with no exceptions. As the requirements of an application change, resources must be re-configured manually to meet the new demand. Such a static IT infrastructure creates the following issues for an IT organization:
· Unable to provide necessary and guaranteed services to multiple lines of business
· Unable to manage real-time workload requirements
· Slower performance and time to discovery
· Increased management complexity
The result? Underutilized resources and a higher total cost of ownership for IT.
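As a concrete illustration of this static allocation, in Hadoop 1.x the number of map and reduce slots on each TaskTracker is fixed in its configuration file. The sketch below uses the standard Hadoop 1.x property names; the specific slot counts are hypothetical values chosen for illustration. Shifting capacity between map and reduce work means editing the file and restarting the daemon, not a runtime decision based on workload demand.

```xml
<!-- mapred-site.xml (Hadoop 1.x): slot counts are fixed per TaskTracker.
     The values below are illustrative. Changing them requires editing this
     file and restarting the TaskTracker, so capacity cannot move between
     map and reduce work (or between applications) at runtime. -->
<configuration>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>8</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>4</value>
  </property>
</configuration>
```

With this configuration, a reduce-heavy job cannot borrow the idle map slots on a node, and vice versa, which is exactly the kind of silo that drives down utilization.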
In contrast to a static IT infrastructure, a sophisticated runtime built on a service-oriented architecture (SOA) brings quality of service to IT organizations committed to providing high-quality services to their internal and/or external clients. Such a runtime solution helps transform IT into a true service provider and meet demanding production requirements (high availability, dynamic resource allocation, ease of management, etc.). As new technologies such as Hadoop and MapReduce continue to penetrate the mainstream market, more applications will be developed and moved into production. Quality of service will undoubtedly become a critical consideration for IT in the next wave of Big Data.
Please join us at SC11 for a free breakfast briefing: “Overcoming Your MapReduce Barriers”. Register today to secure your spot!