What is State-of-the-Art Intelligent Computing?

Where are we currently in the field of intelligent computing? According to experts, we’re moving from an information society to an intelligent society in which optimized computing will autonomously solve real-world problems. This transition depends, of course, on the continued development of advanced computing theories and algorithms that impart varying degrees of intelligence to computing systems through autonomous perception, information gathering, analysis, and reasoning.

Many of the world’s top computing experts recently completed the first comprehensive literature survey of the new discipline of intelligent computing, dedicated to complex problem-solving via artificial intelligence (AI). The review covers the theory of intelligent computing, the fusion of intelligence and computing, and its potential real-world applications. An international team led by Shiqiang Zhu at the Zhejiang Lab in Hangzhou, China, published the paper in Intelligent Computing.

“Its ultimate goal,” said the authors, “is to provide universal, efficient, secure, autonomous, reliable, and transparent computing services to support large-scale and complex computational tasks.”

They identify the biggest hurdle for intelligent computing as fusing intelligence abilities with computation capabilities and innovating the paradigms of ‘computing by intelligence’ and ‘computing for intelligence’. AI based on deep learning still faces significant hurdles in interpretability, generality, evolvability, and autonomy before it can gain a strong foothold. The authors argue that current AI technologies are not as dynamic or integrative as human intelligence and can only perform specialized tasks. Moving from today’s data-based intelligence to more diverse forms, such as perceptual intelligence, cognitive intelligence, autonomous intelligence, and human-machine fusion intelligence, poses significant technological challenges.

The amount of computing capacity needed to power AI applications doubles every 100 days and is predicted to grow more than a million-fold over the next five years, the team explained. Software and hardware will also need to be designed in parallel to move toward more human-like data processing and to support these forms of non-linear, large-scale computing.
