CHINA / SOCIETY
Exclusive: Top Chinese university professor highlights need for research on robots with both emotion and intelligence
Published: Jul 02, 2024 12:27 PM
Gao Yue, Associate Professor at Tsinghua University

In a pivotal development for artificial intelligence (AI), the integration of emotional intelligence and computational intelligence is emerging as a key direction, Gao Yue, an associate professor at China's prestigious Tsinghua University, told the Global Times in an exclusive interview. Gao was discussing the research and application of digital humans and robots with both emotion and intelligence, which was recently named one of the top frontier scientific questions for 2024 by the China Association for Science and Technology.

Currently, AI technology primarily focuses on either emotion analysis or intelligent computation, with limited research on combining the two. As AI advances, the standalone capabilities of emotional analysis or computational intelligence are becoming insufficient to meet the demands of human-machine interaction and application. By merging these aspects to create emotionally intelligent systems, AI can deliver more natural and human-like interactions, enhancing user experience and satisfaction, expanding the scope and depth of AI applications, and making it easier for AI products to integrate into daily life, Gao told the Global Times when explaining the significance of the research.

According to Gao, ideal application scenarios for such emotionally and cognitively intelligent digital humans and robots would include personalized medical services where these robots can recognize a patient’s emotional state and provide tailored care and emotional support, enhancing recovery outcomes and mental health. 

In education, they can tailor teaching plans based on a student’s emotions and learning progress, improving learning experiences and outcomes. In the service industry, they offer personalized services such as customized route guidance and alleviating anxiety in unfamiliar environments, thus increasing customer satisfaction and loyalty, Gao elaborated. 

In home settings, these robots can assist with household chores and provide emotional companionship, playing a crucial role in caring for the elderly and children. In industrial automation, they adjust production pace and task allocation based on workers' emotions and work conditions, improving efficiency and satisfaction, he said. 

“Lastly, in the national security and defense sector, they can provide emotional support and psychological counseling, enhancing soldiers' combat effectiveness and mental resilience.”

Currently, model building in this area faces three main research challenges.

The first is multimodal emotion perception. “Effective multimodal emotion perception requires the integration and processing of information from various sensory channels. This involves overcoming inconsistencies and differences between these channels and establishing high-level associations within and between modalities. The goal is to model and analyze emotional representations comprehensively to extract more accurate and complete emotional information,” Gao said.
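
As a rough illustration of this idea (not Gao's own system), a minimal late-fusion sketch in Python might combine emotion scores from separate audio, text, and video models into one estimate; the encoders, label set, and weights here are hypothetical placeholders:

```python
import numpy as np

# Hypothetical shared emotion label set; real systems define their own taxonomy.
LABELS = ["neutral", "happy", "sad", "angry"]

def late_fusion(modality_scores, weights=None):
    """Weighted average of per-modality emotion probability vectors.

    modality_scores: dict mapping modality name -> probability vector over LABELS
    weights: optional dict mapping modality name -> reliability weight
    """
    names = list(modality_scores)
    w = np.array([1.0 if weights is None else weights.get(n, 1.0) for n in names])
    w = w / w.sum()
    stacked = np.stack([np.asarray(modality_scores[n], dtype=float) for n in names])
    fused = (w[:, None] * stacked).sum(axis=0)
    return fused / fused.sum()

# Placeholder outputs standing in for trained speech, text, and vision models.
scores = {
    "audio": [0.2, 0.1, 0.5, 0.2],
    "text":  [0.1, 0.1, 0.6, 0.2],
    "video": [0.3, 0.1, 0.4, 0.2],
}
fused = late_fusion(scores, weights={"text": 2.0})  # trust the text channel more here
print(LABELS[int(np.argmax(fused))], fused.round(3))
```

Weighting one channel more heavily is one simple way to handle the inconsistencies between modalities that Gao mentions; research systems typically learn such cross-modal associations rather than fixing them by hand.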
The second is personalized emotional and cognitive analysis. “Personalized emotion analysis necessitates the development of emotional models tailored to different individuals. This involves mining individual differences from large-scale data and incorporating these into the emotional intelligence system to achieve personalized emotional understanding and analysis. Additionally, it requires accounting for the dynamic changes in emotions over time and across different environments,” he said.
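
To make the idea of per-user adaptation concrete, a toy sketch (an illustration under assumed parameters, not Gao's method) could keep a per-user prior that drifts over time and blend it with a generic model's output:

```python
import numpy as np

LABELS = ["neutral", "happy", "sad", "angry"]

class PersonalizedEmotionModel:
    """Blend a generic emotion classifier with a per-user prior.

    The prior is an exponential moving average of the emotions observed
    for this user, so it adapts as the user's state changes over time,
    a stand-in for the "dynamic changes in emotions" Gao describes.
    """

    def __init__(self, n_labels, alpha=0.5, blend=0.5):
        self.prior = np.full(n_labels, 1.0 / n_labels)
        self.alpha = alpha    # how quickly the per-user prior adapts
        self.blend = blend    # weight given to the personal prior

    def update(self, observed_distribution):
        obs = np.asarray(observed_distribution, dtype=float)
        self.prior = (1 - self.alpha) * self.prior + self.alpha * obs

    def predict(self, generic_scores):
        p = (1 - self.blend) * np.asarray(generic_scores, dtype=float) + self.blend * self.prior
        return p / p.sum()

user = PersonalizedEmotionModel(len(LABELS))
user.update([0.0, 0.0, 1.0, 0.0])                   # this user has recently appeared sad
personalized = user.predict([0.4, 0.3, 0.2, 0.1])   # generic model leans toward neutral
print(LABELS[int(np.argmax(personalized))], personalized.round(3))
```

With the personal prior included, the same generic prediction is shifted toward "sad" for this particular user, which is the kind of individualized interpretation the research aims to learn from large-scale data rather than hand-set constants.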

The last is biomimetic emotional interaction. “Achieving biomimetic emotional interaction means ensuring that robots and digital humans exhibit sufficient emotional authenticity to foster genuine emotional resonance and connection with users. This challenge involves overcoming unnatural and rigid emotional generation, making emotional expressions more natural and fluid. Besides software algorithms and model design, it also involves hardware considerations, such as perception and sensing technologies, suitable emotional expression devices, and human-computer interaction interfaces to facilitate seamless emotional communication and interaction with intelligent systems.”

In terms of computational power and chip design, new demands and challenges are continuously emerging. For computational power, models need to handle multimodal data from multiple sensors and achieve real-time, accurate human-machine interaction. This requires computational capabilities that support multimodal information fusion, low latency, enhanced learning, and precise reasoning. For chip design, chips for emotionally and cognitively intelligent digital humans and robots must be characterized by high parallelism, low energy consumption, high performance, and real-time capabilities. 

China has made significant progress in affective computing algorithms, particularly in multimedia information processing, speech and text emotion recognition. Numerous research institutions and universities, including Tsinghua University, Harbin Institute of Technology, and Xiamen University, are building large-scale emotional datasets to support the training and optimization of affective computing models. 

They are developing emotion computing methods and tools for various open scenarios. Chinese tech companies like iFlytek and Xiaomi have introduced intelligent customer service robots with emotional interaction capabilities, providing emotional support and natural interactive experiences in the service industry. 

Companies such as Baidu and Alibaba have also developed multimodal emotion recognition systems, offering efficient emotion recognition and generation services in practical applications. Some domestic startups and research teams are exploring the application of emotionally and cognitively intelligent technologies in education and healthcare, developing emotional education assistants and companion robots to enhance user satisfaction and service quality.

Overall, internationally, research on emotionally and cognitively intelligent digital humans and robots has made significant progress in multimodal emotion recognition, emotion generation, and interaction in recent years. Domestic research institutions and enterprises have also achieved important results in affective computing algorithms, emotional interaction systems, and hardware system design. By strengthening international cooperation and interdisciplinary research, Chinese research institutions and enterprises are expected to lead the international technological frontier in emotionally and cognitively intelligent technology, achieving broader applications and development globally.

To tackle this issue, it is necessary to approach it from theoretical foundations, engineering design, and interdisciplinary integration, Gao suggested.

“Universities, research institutes, and tech enterprises need to form a collaborative force to promote the application of research results. From technological development to implementation, a timeframe of 3-5 years is required, with an additional 5-10 years for widespread application and public acceptance of the technology and products in various scenarios," Gao envisioned.