Deep learning maturing in analytics field: Teradata

  • Machines being taught to recognise sounds and other sensor data to advance analytics
  • Enterprises need to capitalise by hiring top chief data officers to drive initiatives

 

DEEP learning in artificial intelligence (AI) is beginning to impact the analytics industry in new ways, and enterprises must capitalise on this trend to differentiate themselves and become more competitive, according to an industry expert.

Teradata Corp chief technology officer (CTO) Stephen Brobst said several factors have accelerated the use of deep learning in analytics, and companies that are able to marry the two technologies are reaping benefits.

Brobst, who was in Kuala Lumpur visiting customers, cited an example in which some innovative high-tech manufacturers have begun using sound recordings for predictive maintenance scheduling.

Brobst said this is akin to how an experienced human mechanic can spot problems in a car engine just by recognising the sound of the fault. With machine learning, maintenance schedules can now be predictive rather than cyclical and reactive, Brobst claimed.

“In high-tech manufacturing, there are cases in which recorded sound waves from machines are being fed into deep learning software that has been taught how to ‘recognise’ when something is wrong,” he explained. “Using this method, the company is able to predict a breakdown before it occurs.”
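As a rough illustration of the approach Brobst describes, the sketch below trains a small neural network to separate “healthy” from “faulty” machine sounds using their frequency spectra. The synthetic signals, the spectrum features and the model are all assumptions chosen for illustration, not Teradata’s or any manufacturer’s actual implementation.

```python
# Minimal sketch: classifying machine sounds for predictive maintenance.
# The signals, features and model below are illustrative assumptions,
# not the system described in the article.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
SAMPLE_RATE, DURATION = 8000, 1.0
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

def machine_sound(faulty: bool) -> np.ndarray:
    """Simulate one second of motor noise; a fault adds a rattle harmonic."""
    signal = np.sin(2 * np.pi * 120 * t)                 # base hum at 120 Hz
    if faulty:
        signal += 0.5 * np.sin(2 * np.pi * 940 * t)      # high-pitched rattle
    return signal + 0.3 * rng.standard_normal(t.size)    # ambient noise

def spectrum_features(signal: np.ndarray) -> np.ndarray:
    """Use the magnitude spectrum as input features, like a 1-D spectrogram."""
    return np.abs(np.fft.rfft(signal))[:500]

X = np.array([spectrum_features(machine_sound(faulty=i % 2 == 1))
              for i in range(200)])
y = np.array([i % 2 for i in range(200)])  # 0 = healthy, 1 = faulty

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"fault-detection accuracy: {clf.score(X_test, y_test):.2f}")
```

A production system would replace the synthetic signals with real recordings and the raw spectrum with richer audio features, but the pipeline shape – sound in, spectral features, learned classifier out – is the same.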

Machine learning, in which software is taught to learn without being explicitly programmed by humans, is a subset of AI. Deep learning, also known as multi-layer neural networks, goes beyond machine learning in that it employs a multi-layer, hierarchical method of problem solving, much like the human brain: data passes through layer after layer until the best output or answer is determined.
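The layer-after-layer idea fits in a few lines of code. The sketch below is a bare-bones forward pass through a hypothetical three-layer network with random, untrained weights – purely to show how each layer transforms the previous layer’s output.

```python
# Bare-bones forward pass through a multi-layer neural network:
# each layer transforms the previous layer's output, step by step.
# The layer sizes and random weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
layer_sizes = [4, 8, 8, 2]   # input -> two hidden layers -> output
weights = [rng.standard_normal((m, n))
           for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x: np.ndarray) -> np.ndarray:
    """Pass the input through every layer in turn."""
    activation = x
    for w, b in zip(weights, biases):
        # The non-linearity (ReLU) is what lets depth capture
        # non-linear patterns a single linear layer cannot.
        activation = np.maximum(0.0, activation @ w + b)
    return activation

print(forward(np.array([0.5, -1.0, 2.0, 0.1])))
```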

Asked what has changed in the industry in the last few years to drive such advances in analytics, Brobst said there are a few factors at play.

The first, he said, has to do with new developments in the techniques used to make predictions.

“For the last few decades, linear mathematics has been used to do predictive analytics, but today non-linear mathematics can be used to give better predictions because software is able to use non-linear data streams such as sound waves.

“This has resulted in systems that are able to analyse a range of sensor data – image, text, temperature, vibration and humidity, for example – besides sound data.”
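To make the linear-versus-non-linear point concrete, the sketch below fits both a linear model and a small neural network to a deliberately non-linear signal. The data and the two models are assumptions chosen for illustration.

```python
# Illustrative comparison: linear vs non-linear predictive models
# on a non-linear signal (here, a noisy sine wave).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(500)  # non-linear target

linear = LinearRegression().fit(X, y)
nonlinear = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=0).fit(X, y)

# R^2 score: 1.0 is a perfect fit; a straight line cannot follow the curve.
print(f"linear model R^2:     {linear.score(X, y):.2f}")
print(f"non-linear model R^2: {nonlinear.score(X, y):.2f}")
```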

Brobst said another factor is breakthroughs in mapping algorithms onto GPU (graphics processing unit) technology using multi-layer neural network implementations, or deep learning techniques.

“The theory of deep learning has been around for 50 years, but the computing capability wasn’t available then. Only now can we use this technique in a scalable and affordable way,” he said.

Brobst said that because of their highly parallel nature, GPUs today allow companies to do analytics on a scale and budget not possible before.

GPUs are designed with many processing cores optimised for parallel execution – running narrowly focused computing tasks simultaneously and repetitively, so that more workloads can be processed at the same time.
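The sketch below shows the kind of speed-up this parallelism delivers. It assumes the PyTorch library and a CUDA-capable GPU, and times the same large matrix multiplication – the core operation inside every neural-network layer – on CPU and GPU.

```python
# Timing the same matrix multiply (the workhorse of neural-network
# layers) on CPU and GPU. Assumes PyTorch is installed; reports
# gracefully if no CUDA device is present.
import time
import torch

def time_matmul(device: str, size: int = 4096) -> float:
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for setup before timing
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels run asynchronously
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.4f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s")
else:
    print("No CUDA GPU available on this machine.")
```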

Finally, unstructured, non-linear data from cheap but accurate sensors is far more widely available today than it was a decade ago. Together, all these factors make it possible for advanced analytics to achieve more than ever before.
