Intel Announces a New Distribution for Apache Hadoop Software
For those unfamiliar with it, the Apache Hadoop software library is a framework that allows for the
distributed processing of large data sets across clusters of computers using
simple programming models. It is designed to scale up from single servers to
thousands of machines, each offering local computation and storage. Rather than
relying on hardware to deliver high availability, the library itself is designed
to detect and handle failures at the application layer, thereby delivering a
highly available service on top of a cluster of computers, each of which may
be prone to failures.
The project includes these modules:
- Hadoop Common: The common utilities that support the other Hadoop modules.
- Hadoop Distributed File System (HDFS™): A distributed file system that provides high-throughput access to application data.
- Hadoop YARN: A framework for job scheduling and cluster resource management.
- Hadoop MapReduce: A YARN-based system for parallel processing of large data sets.
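To give a sense of what those "simple programming models" look like in practice, below is a minimal sketch of the classic word-count job written against Hadoop's Java MapReduce API, closely following the standard Hadoop tutorial example. The class names and the input/output paths (supplied as command-line arguments) are illustrative placeholders:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: runs on each input split in parallel across the cluster,
  // emitting (word, 1) for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: receives all counts for a given word (grouped by the
  // framework's shuffle phase) and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Once compiled into a jar, a job like this would typically be submitted with something along the lines of `hadoop jar wordcount.jar WordCount /input /output` (jar name and HDFS paths hypothetical); YARN schedules the map and reduce tasks across the cluster, and HDFS supplies the input splits and stores the results.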
With this new release, Intel aims to deliver industry-leading performance and
security to make big data easier to access and use.
"People and machines are
producing valuable information that could enrich our lives in so many ways,
from pinpoint accuracy in predicting severe weather to developing customized
treatments for terminal diseases," said Boyd Davis, vice president and
general manager of Intel's Datacenter Software Division.