Rising Hadoop use, but deployment challenges remain: Page 2 of 2
By Edwin Yapp August 27, 2013
Lack of skills, understanding
Asked about the challenges facing enterprises that want to use Hadoop, especially in the South-East Asian region, Teradata country manager Craig Morrison says there is still a shortage of people skilled in Hadoop, which is holding it back from becoming mainstream.
“The initial barriers to the adoption of Hadoop for big data analytics are becoming more obvious in South-East Asia as companies begin experimenting, but the lack of data scientists, the need for complex coding and the evolving Hadoop code base is a challenge,” he says.
Morrison says some companies in South-East Asia are taking an experimental approach to big data, trying out Hadoop to see what role it can play in their environments.
“The first obvious application is replacing some of their traditional, aging storage infrastructure so that they can do more in the near future, once the everyday business benefits of big data and the role of Hadoop become clearer.”
That said, Rojas remains optimistic about Hadoop's adoption. Recalling the early days of Hadoop, he says a few brave data scientists and senior Java developers had to build Hadoop infrastructure from scratch with very primitive resources, but those days are long gone.
Today, a large number of vendors have emerged to bring enterprise capabilities to Hadoop, he says.
Unless customers have very special processing needs, usually in the scientific research domain or other extremely complex processing, they should not really consider experimenting with their own software distributions of Apache Hadoop, Rojas argues.
“Entrepreneurial approaches without specialised knowledge have resulted in many Hadoop failures and negative returns on investment (ROIs) for Hadoop projects,” he explains.
“Today, there are vendors in the market that have enterprise offerings for Hadoop. Teradata in particular has created the Teradata Portfolio for Hadoop as a way for enterprise customers to maximise Hadoop ROI and minimise total cost of data.”
Still, Ovum’s Baer cautions against thinking that Hadoop is already in mainstream use, and highlights several ‘growing pains’ for Hadoop going forward. They include:
- Alternative frameworks to MapReduce (on which Hadoop is based) are starting to emerge;
- New connections are being made with the SQL world;
- Data security is fragmented, but new tools and technologies are surfacing;
- Packaged analytics tools that reduce or eliminate coding are starting to proliferate; and
- Management capabilities for rationalising Hadoop clusters with data centre infrastructure are only now starting to emerge.
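The ‘complex coding’ Morrison cites and the MapReduce alternatives Baer points to both trace back to MapReduce’s programming model, in which every job is expressed as a map phase, a shuffle, and a reduce phase. As a rough illustration only — this is a local Python sketch of the model, not actual Hadoop API code — the canonical word-count example decomposes like this:

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle step: group values by key, as Hadoop does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

if __name__ == "__main__":
    lines = ["big data needs big tools", "hadoop handles big data"]
    counts = reduce_phase(shuffle_phase(map_phase(lines)))
    print(counts["big"])   # 3
    print(counts["data"])  # 2
```

Even this trivial job takes three distinct functions, which hints at why writing real MapReduce code for analytical queries is considered a barrier, and why the SQL-on-Hadoop connectors and packaged analytics tools in Baer’s list are emerging.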
Tools becoming available
In June, Teradata introduced its Teradata Portfolio for Hadoop, a solution which the company claims makes Hadoop easier to integrate into modern enterprise data architectures and enables more users to know more, and do more, with data in Hadoop.
Rojas says the Teradata Portfolio for Hadoop is targeted at customers who are struggling with their big data strategy and having to hire expensive, hard-to-find data scientists and big data architects.
“For customers that want integrated product offerings along with strategic and practical hands-on guidance to implement their solution, optimise, and gain knowledge transfer, Teradata provides a single source for all things Hadoop and beyond, as part of the enterprise data architecture.
“Our approach is to provide different product offerings for different customer needs. For example, the Teradata premier appliances for Hadoop give IT a turnkey, ready-to-run solution aligned to business value; while the Teradata commodity offerings for Hadoop equip customers with the resources and skills to deploy software-only solutions, an option for implementing Hadoop on their own hardware,” he claims.