Angelica Lim

According to our database, Angelica Lim authored at least 51 papers between 2010 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
React to This! How Humans Challenge Interactive Agents using Nonverbal Behaviors.
CoRR, 2024

Past, Present, and Future: A Survey of The Evolution of Affective Robotics For Well-being.
CoRR, 2024

Mmm whatcha say? Uncovering distal and proximal context effects in first and second-language word perception using psychophysical reverse correlation.
CoRR, 2024

Predicting Long-Term Human Behaviors in Discrete Representations via Physics-Guided Diffusion.
CoRR, 2024

Contextual Emotion Recognition using Large Vision Language Models.
CoRR, 2024

Good Things Come in Trees: Emotion and Context Aware Behaviour Trees for Ethical Robotic Decision-Making.
CoRR, 2024

EmoStyle: One-Shot Facial Expression Editing Using Continuous Emotion Parameters.
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2024

2023
MotionScript: Natural Language Descriptions for Expressive 3D Human Motions.
CoRR, 2023

Emotional Theory of Mind: Bridging Fast Visual Processing with Slow Linguistic Reasoning.
CoRR, 2023

The dynamic nature of trust: Trust in Human-Robot Interaction revisited.
CoRR, 2023

Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts.
IROS, 2023

An MCTS-DRL Based Obstacle and Occlusion Avoidance Methodology in Robotic Follow-Ahead Applications.
IROS, 2023

I'm a Robot, Hear Me Speak!
Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 2023

Contextual Emotion Estimation from Image Captions.
Proceedings of the 11th International Conference on Affective Computing and Intelligent Interaction, 2023

2022
Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts.
CoRR, 2022

Data-driven emotional body language generation for social robotics.
CoRR, 2022

Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation.
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2022

Towards Inclusive HRI: Using Sim2Real to Address Underrepresentation in Emotion Expression Recognition.
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2022

Human Navigational Intent Inference with Probabilistic and Optimal Approaches.
Proceedings of the 2022 International Conference on Robotics and Automation, 2022

Inclusive HRI: Equity and Diversity in Design, Application, Methods, and Community.
Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, 2022

2021
A Deeper Look at Autonomous Vehicle Ethics: An Integrative Ethical Decision-Making Framework to Explain Moral Pluralism.
Frontiers Robotics AI, 2021

A Multimodal and Hybrid Framework for Human Navigational Intent Inference.
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2021

Perceptual Effects of Ambient Sound on an Artificial Agent's Rate of Speech.
Proceedings of the Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 2021

Developing a Data-Driven Categorical Taxonomy of Emotional Expressions in Real World Human Robot Interactions.
Proceedings of the Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 2021

The Many Faces of Anger: A Multicultural Video Dataset of Negative Emotions in the Wild (MFA-Wild).
Proceedings of the 16th IEEE International Conference on Automatic Face and Gesture Recognition, 2021

Children, Robots, and Virtual Agents: Present and Future Challenges.
Proceedings of the IDC '21: Interaction Design and Children, 2021

2020
SFU-Store-Nav: A Multimodal Dataset for Indoor Human Navigation.
CoRR, 2020

Generating Robotic Emotional Body Language of Targeted Valence and Arousal with Conditional Variational Autoencoders.
Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 2020

2019
Challenges in Distributed Agile Software Development Environment: A Systematic Literature Review.
KSII Trans. Internet Inf. Syst., 2019

Towards an EmoCog Model for Multimodal Empathy Prediction.
Proceedings of the 14th IEEE International Conference on Automatic Face & Gesture Recognition, 2019

Commodifying Pointing in HRI: Simple and Fast Pointing Gesture Detection from RGB-D Images.
Proceedings of the 16th Conference on Computer and Robot Vision, 2019

Investigating Positive Psychology Principles in Affective Robotics.
Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction, 2019

Generating robotic emotional body language with variational autoencoders.
Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction, 2019

The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling.
Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction, 2019

2018
How does the robot feel? Perception of valence and arousal in emotional body language.
Paladyn J. Behav. Robotics, 2018

HRI 2018 Workshop: Social Robots in the Wild.
Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 2018

2017
UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions.
Proceedings of the 19th ACM International Conference on Multimodal Interaction, 2017

Gaze and filled pause detection for smooth human-robot conversations.
Proceedings of the 17th IEEE-RAS International Conference on Humanoid Robotics, 2017

2016
Habit detection within a long-term interaction with a social robot: an exploratory study.
Proceedings of the International Workshop on Social Learning and Multimodal Interaction for Designing Artificial Agents, 2016

International workshop on social learning and multimodal interaction for designing artificial agents (workshop summary).
Proceedings of the 18th ACM International Conference on Multimodal Interaction, 2016

2015
A Recipe for Empathy - Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese.
Int. J. Soc. Robotics, 2015

2014
MEI: multimodal emotional intelligence.
PhD thesis, 2014

The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence.
IEEE Trans. Auton. Ment. Dev., 2014

Making a robot dance to diverse musical genre in noisy environments.
Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014

2012
Musical Robots and Interactive Multimodal Systems.
Int. J. Synth. Emot., 2012

Towards expressive musical robots: a cross-modal framework for emotional gesture, voice and music.
EURASIP J. Audio Speech Music. Process., 2012

A multimodal tempo and beat-tracking system based on audiovisual information from live guitar performances.
EURASIP J. Audio Speech Music. Process., 2012

A Musical Robot that Synchronizes with a Coplayer Using Non-Verbal Cues.
Adv. Robotics, 2012

Using Speech Data to Recognize Emotion in Human Gait.
Proceedings of the Human Behavior Understanding - Third International Workshop, 2012

2011
Converting emotional voice to motion for robot telepresence.
Proceedings of the 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2011), 2011

2010
Robot musical accompaniment: integrating audio and visual cues for real-time synchronization with a human flutist.
Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2010
