Cognitive computing systems (CCS) simulate human cognitive processes, including self-learning, emotional competence, language understanding, social interaction, interpretation, planning, decision-making and problem solving, integrated with mechanisms for environmental perception and action. CCS are designed on the basis of interdisciplinary research spanning psychology, machine learning, artificial intelligence, signal processing, computer vision and human-computer interaction. A cognitive computing system is contextual, adaptive, interactive and stateful, and it reformulates the essence of the relationship between humans and the ubiquitous digital ecosystem.
As humans have begun to communicate with artificially intelligent agents, automatic emotion recognition has become an important field. The majority of human-computer interaction systems are not emotion aware. One of the most important goals of a cognitive system is to be emotion aware, so that it can consider human moods during conversation and, in relation to context, use the emotional state as an input to its response process and actions. Artificial emotional intelligence, or Emotion AI, is also known as affective computing, emotion recognition or emotion detection. Emotion AI enables many applications to recognize human emotional states and to adapt their responses or actions to the detected emotions. Emotion AI can be applied and integrated in a rich set of domains including digital assistants, humanoid robots, human-robot interaction, chatbots, recommendation engines, neuromarketing and computer games.
The majority of existing emotion understanding techniques are based on a single modality. Among single modalities, the most commonly studied approach is facial expression recognition. One reason for this is that facial expressions contribute more to the whole message than the verbal and vocal components. Another reason is that Ekman states that facial expressions of emotion are universal, and that the expressions of the human face can be classified into categories regardless of race or age. However, people are able to camouflage their emotions during conversation. Relying only on facial expression signals therefore has a disadvantage: they alone are not reliable for detecting emotion, especially when people want to hide their feelings. To overcome this disadvantage, physiological, electroencephalography (EEG) and human voice signals have been incorporated into emotion recognition research. EEG and physiological signals provide an "internal" view of emotional processes, while images and video sequences give an "external" view of the emotion recognition problem.
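Combining the "internal" and "external" modalities described above is often done by fusing per-modality predictions. As a minimal sketch, the following late-fusion scheme averages the emotion probability vectors produced by independent modality-specific classifiers; the modality names, emotion categories (Ekman's six basic emotions) and equal default weights are illustrative assumptions, not a method prescribed by the text.

```python
# Hedged sketch: decision-level (late) fusion of per-modality emotion
# predictions. Each modality classifier is assumed to output a probability
# vector over the same emotion categories.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def late_fusion(modality_probs, weights=None):
    """Weighted average of probability vectors from several modalities.

    modality_probs: dict mapping a modality name (e.g. "face", "eeg",
                    "voice") to a probability vector over EMOTIONS.
    weights:        optional dict of per-modality weights; defaults to
                    equal weighting.
    """
    if weights is None:
        weights = {m: 1.0 for m in modality_probs}
    total = sum(weights[m] for m in modality_probs)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in modality_probs.items():
        w = weights[modality] / total
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

def predict_emotion(modality_probs, weights=None):
    """Return the emotion label with the highest fused probability."""
    fused = late_fusion(modality_probs, weights)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]
```

For example, if the facial classifier strongly favors happiness and the EEG classifier mildly agrees, the fused prediction remains happiness even when one modality is noisy; per-modality weights can down-weight a less reliable channel.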
Artificial emotional intelligence
Artificial emotional intelligence or Emotion AI, also known as affective computing, emotion recognition or emotion detection, enables artificial intelligence applications built on computer-human interfaces to recognize human emotional states and to adapt their responses or actions to the detected emotions. Such applications include digital assistants, humanoid robots, computer-assisted tutoring, intelligent interactive emotion-aware games, neural-signal-based marketing (neuromarketing), emotion-aware mobile apps, human-robot interaction, chatbots and recommendation engines.