What are the emerging technologies that will continue to be popular in 2020?

In addition to artificial intelligence and machine learning, sensing and mobility, next-generation cognitive computing, 5G, and AR and VR technologies, what are the top ten emerging technologies that will drive innovation in the next decade and continue to be popular in 2020?

Serverless and next-generation cloud computing

Today, cloud computing has permeated many aspects of our lives. Whether you realize it or not, much of the data we use in daily communication is stored in the cloud. Serverless computing is a cloud computing execution model in which vendors provide back-end services on a pay-per-use basis. Servers have not disappeared and are still in use, but a company that obtains back-end services from a serverless provider is charged based on usage rather than a fixed amount of bandwidth or number of servers.

Serverless computing, also known as Function-as-a-Service (FaaS), enables companies to build scalable real-time applications that can respond to demand that changes by orders of magnitude in an instant. As mentioned earlier, cloud computing provides scalable, virtually unrestricted computing resources for research and reduces the time it takes to bring new products to market.
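The per-request model described above can be sketched as a single handler function that the platform invokes on demand. The event shape and handler signature below mirror common serverless platforms such as AWS Lambda, but the field names are illustrative assumptions, not any specific provider's API.

```python
import json

def handler(event, context=None):
    # The platform calls this function once per request and bills per
    # invocation and duration, not per provisioned server.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the function is stateless, the provider can run as many copies in parallel as traffic requires, which is what makes the order-of-magnitude scaling possible.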

Natural language processing 

Natural language processing (NLP) is a field of artificial intelligence that enables computers to analyze and understand human language and handles the interaction between computers and humans in natural language. Speech-to-text converts spoken human language into machine-readable text, while text-to-speech converts a computer's output into an audible response.

Natural language processing is widely used in devices we rely on every day. The emergence of AI chips, also known as AI accelerators, will further accelerate the development of this technology. For example, a voice assistant like the familiar Siri has a built-in natural language processing engine that converts speech into words and meaning.
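The text side of such a pipeline can be sketched in miniature: after speech-to-text produces a transcript, the engine tokenizes it and maps the tokens to an intent. This is a toy keyword-matching sketch, not how Siri actually works; the intent names and keyword sets are illustrative assumptions.

```python
import re

# Hypothetical intent table: each intent is triggered by any of its keywords.
INTENTS = {
    "weather": {"weather", "rain", "forecast"},
    "timer": {"timer", "alarm", "remind"},
}

def detect_intent(transcript: str) -> str:
    # Tokenize the transcript into lowercase words, then look for overlap
    # with each intent's keyword set.
    tokens = set(re.findall(r"[a-z']+", transcript.lower()))
    for intent, keywords in INTENTS.items():
        if tokens & keywords:
            return intent
    return "unknown"
```

Real NLP engines replace the keyword table with statistical or neural models, but the overall transcript-to-intent flow is the same.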

It is a pity that today's mainstream voice assistants (Alexa, Siri, Google Home) were not designed for industrial environments. However, next-generation natural language processing is already being used in industrial Internet of Things devices.


Robotics

Robots have been around longer than you might think. The earliest robots as we know them today were developed by George C. Devol, an inventor from Louisville, Kentucky's largest city.

Many things have changed since the first robots came out in the early 1950s. Robotics is the intersection of science, engineering and technology to produce machines called robots.

Unlike a decade ago, robotics has shifted from industrial use toward service and delivery. Robots are affecting homes and businesses in both physical and virtual ways. As mentioned earlier, with the advent of 5G technology, doctors can now use robots for remote surgery. In addition to surgical robots, hospitals and treatment centers are now using robots to improve the quality of care and patient outcomes.

Internet of Things

Briefly, the basic idea of the Internet of Things (IoT) is to connect any device to the Internet and then connect the devices to one another. Such a device, also known as an IoT device, is a piece of hardware with sensors that transmits data from one place to another over the Internet. IoT devices include wireless sensors, software, actuators, computing devices, and so on.
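The data an IoT sensor node transmits can be sketched as a small serialized payload. The field names and units below are illustrative assumptions; real deployments typically publish such payloads over a transport like MQTT or CoAP.

```python
import json
import time

def make_reading(device_id: str, temperature_c: float) -> str:
    # Bundle one sensor sample with its source and timestamp, serialized
    # as JSON for transmission over the network.
    payload = {
        "device_id": device_id,
        "temperature_c": temperature_c,
        "timestamp": int(time.time()),
    }
    return json.dumps(payload)
```

On the receiving side, a gateway or cloud service parses the same JSON back into a record, which is what lets thousands of heterogeneous devices feed one pipeline.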

Unlike the early Internet of Things, the next generation of IoT is opening the era of the fourth industrial revolution, namely Industry 4.0. Specifically, Industry 4.0 focuses on smart factories that rely on the Internet of Things, affecting every industrial process from manufacturing to logistics and the supply chain. The Internet of Things is one of the nine pillars of Industry 4.0.

The nine technology pillars of Industry 4.0 are the industrial Internet of Things, cloud computing, industrial big data, industrial robotics, 3D printing, automation of knowledge work, industrial cybersecurity, virtual reality, and artificial intelligence.

Quantum computing

Unlike traditional computers, which store information in bits that are either 0 or 1, quantum computers use quantum bits, or qubits, to encode information as 0, 1, or both simultaneously. Quantum computing began in the early 1980s, when physicist Paul Benioff first proposed a quantum mechanical model of the Turing machine.
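The "0 and 1 simultaneously" idea can be sketched numerically: a single qubit is a pair of complex amplitudes, one for each basis state, and applying a Hadamard gate to |0⟩ puts it in an equal superposition. This is a minimal classical simulation for illustration, not a quantum program.

```python
import math

def hadamard(state):
    # A qubit is (alpha, beta): the complex amplitudes of |0> and |1>.
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    # Measurement probabilities are the squared magnitudes of the amplitudes.
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

# Start in |0> and apply a Hadamard gate: the qubit is now equally
# likely to be measured as 0 or as 1.
qubit = hadamard((1 + 0j, 0 + 0j))
```

Simulating n qubits this way requires 2**n amplitudes, which is exactly why classical machines struggle to imitate even modest quantum processors.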

Ever since, tech giants like Google and IBM have been working to bring the technology into the mainstream. In September 2019, Google researchers published a paper claiming that quantum supremacy had been achieved. Google's quantum processor was reportedly able to complete in 3 minutes and 20 seconds a computation that would take the state-of-the-art classical supercomputer (Summit) roughly 10,000 years.

Although the paper, originally published on a National Aeronautics and Space Administration (NASA) website, was soon taken down, it still sparked widespread discussion inside and outside the industry. Whether or not Google's claim holds up, quantum computing will certainly open up new possibilities and help solve previously intractable computing problems.
