A multi-layer artificial intelligence and sensing based affective conversational embodied agent

Title: A multi-layer artificial intelligence and sensing based affective conversational embodied agent
Publication Type: Conference Paper
Year of Publication: 2019
Authors: DiPaola, Steve; Yalçin, Özge Nilay
Conference Name: 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)
Keywords: affective computing, artificial intelligence, biosensing, Biosensors, conversation agents, conversational agent, conversational agents, Deep Learning, embodied agent, embodied character agents, emotion recognition, Human Behavior, machine learning, Metrics, pubcrawl, Real-time Systems, Scalability, sensing systems, Streaming media

Building natural, conversational virtual humans is a task of formidable complexity. We believe that, especially when building agents that interact affectively with biological humans in real time, a cognitive-science-based, multi-layered sensing and artificial intelligence (AI) systems approach is needed. For this demo, we show a working version (through live human interaction) of our modular system: a natural, conversational 3D virtual human built from AI and sensing layers. These layers sense the human user via facial emotion recognition, voice stress, the semantic meaning of words, eye gaze, heart rate, and galvanic skin response. These inputs are combined with AI sensing and recognition of the environment using deep-learning natural language captioning or dense captioning. All of this is processed by our AI avatar system, enabling an affective and empathetic exchange through NLP topic-based dialogue that employs facial expressions, gestures, breath, eye gaze, and voice in two-way, back-and-forth conversation with the sensed human. Our lab has been building these systems in stages over the years.
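The abstract describes fusing several sensing layers (face, voice, biosignals) into a single affective estimate that drives the avatar's expressive behavior. A minimal sketch of such confidence-weighted fusion is shown below; the layer names, the valence/arousal representation, and all function names are illustrative assumptions, not the authors' actual system or API.

```python
from dataclasses import dataclass

# Hypothetical per-layer affect reading; names and ranges are assumptions,
# not the authors' interfaces. Valence in [-1, 1], arousal in [0, 1].
@dataclass
class SensorReading:
    source: str        # e.g. "face", "voice", "gsr"
    valence: float     # negative .. positive affect
    arousal: float     # calm .. excited
    confidence: float  # 0..1 weight used during fusion

def fuse_affect(readings):
    """Confidence-weighted average of per-layer affect estimates."""
    total = sum(r.confidence for r in readings)
    if total == 0:
        return 0.0, 0.0  # no usable signal: neutral affect
    valence = sum(r.valence * r.confidence for r in readings) / total
    arousal = sum(r.arousal * r.confidence for r in readings) / total
    return valence, arousal

def avatar_response(valence, arousal):
    """Map fused affect to a coarse expressive state for the avatar."""
    if valence >= 0:
        return "smile" if arousal < 0.5 else "enthusiastic"
    return "concerned" if arousal < 0.5 else "soothing"

# Example: three sensing layers disagree mildly; fusion reconciles them.
readings = [
    SensorReading("face", 0.6, 0.4, 0.9),
    SensorReading("voice", 0.2, 0.7, 0.5),
    SensorReading("gsr", 0.0, 0.8, 0.3),
]
v, a = fuse_affect(readings)
print(avatar_response(v, a))  # prints "enthusiastic"
```

The weighting lets noisy layers (here, GSR with low confidence) contribute without dominating, which matches the paper's theme of combining many imperfect real-time signals.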

Citation Key: dipaola_multi-layer_2019