Where will technology take us in 2020?

  • Cloud computing is redefining every aspect of IT
  • Machines cooperate and compete with each other to complete target tasks


From cognitive intelligence, in-memory computing, fault-tolerant quantum computing, and semiconductor devices built on new materials, to the faster growth of the industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and privacy-preserving AI technologies, we can expect technological advancements and breakthroughs to gain momentum and have a profound impact on our daily lives in the year ahead.

Here are the top 10 technology trends for 2020, as seen by the Alibaba Damo Academy, Alibaba Group’s global research initiative.

Artificial intelligence evolves from perceptual intelligence to cognitive intelligence

Artificial intelligence has matched or surpassed human performance in areas of perceptual intelligence such as speech-to-text, natural language processing, and video understanding; but in the field of cognitive intelligence, which requires external knowledge, logical reasoning, or domain transfer, it is still in its infancy.

Cognitive intelligence will draw inspiration from cognitive psychology, brain science, and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference, and continual learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These mechanisms will allow machines to understand and apply knowledge, marking the key breakthrough from perceptual to cognitive intelligence.
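As a toy sketch of the idea (purely illustrative, not Damo Academy's system; the entities and the transitivity rule are invented for this example), stored knowledge plus one simple inference rule already lets a machine derive a fact it was never explicitly told:

```python
# Tiny knowledge graph of (subject, relation, object) triples.
facts = {
    ("Alibaba", "headquartered_in", "Hangzhou"),
    ("Hangzhou", "located_in", "Zhejiang"),
    ("Zhejiang", "located_in", "China"),
}

def infer_located_in(facts):
    """Apply one rule to a fixed point: located_in is transitive,
    and being headquartered in a place implies being located in it."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(derived):
            for (c, r2, d) in list(derived):
                if b == c and r1 in ("headquartered_in", "located_in") \
                        and r2 == "located_in":
                    new = (a, "located_in", d)
                    if new not in derived:
                        derived.add(new)
                        changed = True
    return derived

kg = infer_located_in(facts)
print(("Alibaba", "located_in", "China") in kg)  # True, though never stated
```

Real cognitive-intelligence systems operate over knowledge graphs with billions of triples and learned, probabilistic rules, but the principle of combining stored knowledge with reasoning is the same.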

In-memory computing addresses the "memory wall" challenge in AI computing

In the Von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth between them. With the rapid development of data-driven AI algorithms in recent years, this data movement has become the hardware bottleneck in the exploration of more advanced algorithms.

In the processing-in-memory (PIM) architecture, by contrast, memory and processor are fused together, and computations are performed where the data is stored, with minimal data movement. As a result, computation parallelism and power efficiency can be significantly improved. We believe innovations in PIM architecture will be the ticket to next-generation AI.
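A back-of-envelope calculation shows why data movement, not arithmetic, is the bottleneck. The per-operation energy figures below are approximate values commonly quoted from Mark Horowitz's ISSCC 2014 keynote for a 45nm process; exact numbers vary by technology node:

```python
# Approximate per-operation energies (picojoules); illustrative only.
ENERGY_PJ = {
    "fp32_multiply": 3.7,    # one 32-bit floating-point multiply
    "dram_read_32b": 640.0,  # fetching one 32-bit word from DRAM
}

def energy_ratio():
    """How many multiplies fit in the energy budget of one DRAM read."""
    return ENERGY_PJ["dram_read_32b"] / ENERGY_PJ["fp32_multiply"]

print(f"~{energy_ratio():.0f}x")  # moving one word costs ~170x the compute
```

When fetching an operand costs two orders of magnitude more energy than operating on it, performing the computation where the data already resides, as PIM does, is the natural response.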

Industrial IoT powers digital transformations

In 2020, 5G, the rapid development of IoT devices, cloud computing, and edge computing will accelerate the fusion of information, communication, and industrial control systems. Through an advanced industrial IoT, manufacturing companies can automate machines, in-factory logistics, and production scheduling, realising C2B (consumer-to-business) smart manufacturing.

In addition, interconnected industrial systems can adjust and coordinate the production capability of both upstream and downstream vendors. Ultimately it will significantly increase the manufacturers’ productivity and profitability.

Large-scale collaboration between machines becomes possible

Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale fleets of intelligent devices. The development of collaborative sensing in the Internet of Things, together with 5G communication technology, will enable collaboration among multiple agents: machines will cooperate and compete with each other to complete target tasks.

The swarm intelligence that emerges from cooperating agents will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will adjust dynamically in real time; warehouse robots will work together to sort cargo more efficiently; driverless cars will perceive overall road traffic conditions; and groups of unmanned aerial vehicles (UAVs) will collaborate to complete last-mile delivery more efficiently.
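As a minimal illustration of machine collaboration (the robots, tasks, and coordinates are hypothetical, and real dispatchers use far more sophisticated optimisation), a greedy assignment loop lets warehouse robots cooperatively claim sorting tasks so that no task is handled twice and total travel stays low:

```python
# Robot and task positions on a warehouse grid (invented for this sketch).
robots = {"R1": (0, 0), "R2": (5, 5)}
tasks = {"T1": (1, 0), "T2": (4, 5), "T3": (0, 2)}

def assign(robots, tasks):
    """Repeatedly give the closest (robot, task) pair its task."""
    remaining = dict(tasks)
    plan = {r: [] for r in robots}
    pos = dict(robots)  # robots move as they complete tasks
    while remaining:
        r, t = min(
            ((r, t) for r in robots for t in remaining),
            key=lambda rt: abs(pos[rt[0]][0] - remaining[rt[1]][0])
                         + abs(pos[rt[0]][1] - remaining[rt[1]][1]),
        )
        plan[r].append(t)
        pos[r] = remaining.pop(t)  # robot is now at the task location
    return plan

print(assign(robots, tasks))  # {'R1': ['T1', 'T3'], 'R2': ['T2']}
```

Even this naive shared-state scheme beats each robot planning in isolation, which could send both robots to the same bin; production systems replace the greedy rule with auction-based or learned coordination.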

Modular design makes chip development easier and faster by stacking chiplets together

Traditional chip design cannot respond efficiently to the fast-evolving, fragmented, and customised needs of chip production. Open-source SoC design based on RISC-V, high-level hardware description languages, and IP-based modular design methods are accelerating the development of agile design practices and an open-source chip ecosystem.

In addition, chiplet-based modular design uses advanced packaging to combine chiplets with different functions into a single package, making it possible to quickly customise and deliver chips that meet the specific requirements of different applications.


Large-scale production-grade blockchain applications will gain mass adoption

BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips designed specifically for blockchain, with core algorithms embedded for edge and cloud deployments, will also emerge, allowing assets in the physical world to be mapped to assets on the blockchain, further expanding the boundaries of the Internet of Value and realising "multi-chain interconnection".
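The property that makes mapping physical assets onto a chain worthwhile is tamper evidence: each record commits to the one before it. A minimal, illustrative hash-linked chain (not any production blockchain, and omitting consensus entirely) shows the mechanism:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash before the first block

def _digest(payload, prev_hash):
    body = json.dumps({"payload": payload, "prev": prev_hash},
                      sort_keys=True).encode()
    return hashlib.sha256(body).hexdigest()

def add_block(chain, payload):
    """Append a record that commits to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": _digest(payload, prev_hash)})
    return chain

def is_valid(chain):
    """Recompute every link; any edit to history breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else GENESIS
        if block["prev"] != prev or block["hash"] != _digest(block["payload"], prev):
            return False
    return True

chain = []
add_block(chain, {"asset": "container-123", "owner": "A"})
add_block(chain, {"asset": "container-123", "owner": "B"})
print(is_valid(chain))               # True
chain[0]["payload"]["owner"] = "X"   # tamper with history
print(is_valid(chain))               # False: the edit is detected
```

Production-grade systems add distributed consensus, signatures, and trusted hardware at the physical-digital boundary on top of this basic structure.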

In the future, a large number of innovative blockchain application scenarios with multi-dimensional collaboration across different industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.

A critical period before large-scale quantum computing

In 2019, the race to reach "quantum supremacy" brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosted overall confidence in superconducting circuits as a route to a large-scale quantum computer.

In 2020, the field of quantum computing will receive increasing investment, which comes with enhanced competition.

The field is also expected to experience a speed-up in industrialisation and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realisation of fault-tolerant quantum computing and the demonstration of quantum advantages in real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.

New materials will revolutionise semiconductor devices

Under the pressure of both the slowing of Moore's Law and the explosive demand for computing power and storage, classic silicon-based transistors are struggling to sustain the growth of the semiconductor industry.

To date, major semiconductor manufacturers still have no clear path to chips beyond the 3nm node. New materials, through new physical mechanisms, will enable new logic, storage, and interconnect devices, driving continuous innovation in the semiconductor industry.

For example, topological insulators and two-dimensional superconducting materials, which allow lossless transport of electrons and spins, could become the basis for new high-performance logic and interconnect devices; while new magnetic materials and new resistive-switching materials can enable high-performance magnetic memory such as SOT-MRAM, as well as resistive memory.

Growing adoption of AI technologies that protect data privacy

The compliance costs demanded by recent data protection laws and regulations governing data transfer are increasing. In light of this, there is growing interest in using AI technologies to protect data privacy.

The essence is to enable the data user to compute a function over input data from different data providers while keeping the data private. Such AI technologies promise to solve the problems of data silos and the lack of trust in today's data sharing practices, and will truly unleash the value of data in the foreseeable future.
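One classic building block behind such technologies is additive secret sharing, sketched below. This is illustrative only; production systems build full secure multi-party computation or federated learning on top of primitives like this. Here three parties jointly compute the sum of their inputs without any party revealing its own value:

```python
import random

MOD = 2**31  # all arithmetic is done modulo a public constant

def share(secret, n_parties):
    """Split a secret into n random shares that sum to it (mod MOD).
    Any n-1 shares together reveal nothing about the secret."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

inputs = [42, 7, 100]  # each party's private value (invented numbers)
n = len(inputs)

# Each party splits its input; party j receives the j-th share of everyone.
all_shares = [share(x, n) for x in inputs]

# Each party publishes only the sum of the shares it holds.
partial_sums = [sum(col) % MOD for col in zip(*all_shares)]

# The published partial sums reveal the total, but not any individual input.
print(sum(partial_sums) % MOD)  # 149
```

Because each individual share is uniformly random, the data providers learn nothing about one another's inputs, yet the data user still obtains the exact function value, which is precisely the separation of computation from disclosure described above.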

Cloud becomes the centre of IT technology innovation

With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure, and has gradually evolved into the centre of all IT technology innovation.

Cloud has a close relationship to almost all IT technologies, including new chips, new databases, self-driving adaptive networks, Big Data, AI, IoT, blockchain, quantum computing and so forth.

Meanwhile, it is creating new technologies of its own, such as serverless computing, cloud-native software architectures, software-hardware co-design, and intelligent automated operations.

Cloud computing is redefining every aspect of IT, making new IT technologies more accessible for the public. Cloud has become the backbone of the entire digital economy.

