Encoding of multi-modal emotional information via personalized skin-integrated wireless facial interface
Human affects, such as emotions, moods, and feelings, are increasingly being considered key parameters for enhancing the interaction of humans with diverse machines and systems. However, their intrinsically abstract and ambiguous nature makes it challenging to accurately extract and exploit the emotional inf...
Main Authors: Lee, Jin Pyo; Jang, Hanhyeok; Jang, Yeonwoo; Song, Hyeonseo; Lee, Suwoo; Lee, Pooi See; Kim, Jiyun
Other Authors: School of Materials Science and Engineering
Format: Article
Language: English
Published: 2024
Online Access: https://hdl.handle.net/10356/174702
Institution: Nanyang Technological University
Similar Items
- Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces
  by: Luo, Chengwen, et al.
  Published: (2016)
- The relationship between the affective components of alexithymia and facial recognition and expression of emotion
  by: LIU LI JUAN DENISE
  Published: (2010)
- A novel application of self-organizing network for facial expression recognition from radial encoded contours
  by: Gu, W.F., et al.
  Published: (2014)
- Side profile facial recognition using CNN
  by: Varthamanan Manisha
  Published: (2024)
- Active affective facial analysis for human-robot interaction
  by: Ge, S.S., et al.
  Published: (2014)