K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations


  • 1.

    Salovey, P. & Mayer, J. D. Emotional intelligence. Imagination, Cogn. Pers. 9, 185–211 (1990).

  • 2.

    Mayer, J. D., Caruso, D. R. & Salovey, P. Emotional intelligence meets traditional standards for an intelligence. Intelligence 27, 267–298 (1999).

  • 3.

    Salovey, P. E. & Sluyter, D. J. Emotional development and emotional intelligence: educational implications. (Basic Books, 1997).

  • 4.

    Lopes, P. N. et al. Emotional intelligence and social interaction. Pers. Soc. Psychol. Bull. 30, 1018–1034 (2004).

  • 5.

    Esteva, A. et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature 542, 115–118 (2017).

  • 6.

    Mastoras, R.-E. et al. Touchscreen typing pattern analysis for remote detection of the depressive tendency. Sci. Rep. 9, 1–12 (2019).

  • 7.

    Yurtsever, E., Lambert, J., Carballo, A. & Takeda, K. A survey of autonomous driving: common practices and emerging technologies. IEEE Access 8, 58443–58469 (2020).

  • 8.

    Pennachin, C. & Goertzel, B. Contemporary approaches to artificial general intelligence. In Artificial General Intelligence, 1–30 (Springer, 2007).

  • 9.

    Silver, D. et al. Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016).

  • 10.

    Silver, D. et al. Mastering the game of Go without human knowledge. Nature 550, 354–359 (2017).

  • 11.

    Reeves, B. & Nass, C. I. The media equation: how people treat computers, television, and new media like real people and places. (Cambridge University Press, 1996).

  • 12.

    Turpen, A. MIT wants self-driving cars to traffic in human emotion. New Atlas, https://newatlas.com/automotive/mit-self-driving-cars-human-emotion/ (2019).

  • 13.

    Barrett, L. F. How emotions are made: the secret life of the brain (Houghton Mifflin Harcourt, 2017).

  • 14.

    Du, S., Tao, Y. & Martinez, A. M. Compound facial expressions of emotion. Proc. Natl. Acad. Sci. 111, E1454–E1462 (2014).

  • 15.

    Yannakakis, G. N., Cowie, R. & Busso, C. The ordinal nature of emotions. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), 248–255 (IEEE, 2017).

  • 16.

    Frank, M. G. & Svetieva, E. Microexpressions and deception. In Understanding Facial Expressions in Communication, 227–242 (Springer, 2015).

  • 17.

    Barrett, L. F., Adolphs, R., Marsella, S., Martinez, A. M. & Pollak, S. D. Emotional expressions reconsidered: challenges to inferring emotion from human facial movements. Psychol. Sci. Public Interest 20, 1–68 (2019).

  • 18.

    Carroll, J. M. & Russell, J. A. Do facial expressions signal specific emotions? Judging emotion from the face in context. J. Pers. Soc. Psychol. 70, 205 (1996).

  • 19.

    Cauldwell, R. T. Where did the anger go? The role of context in interpreting emotion in speech. In ISCA Tutorial and Research Workshop (ITRW) on Speech and Emotion (2000).

  • 20.

    Barrett, L. F., Mesquita, B. & Gendron, M. Context in emotion perception. Curr. Dir. Psychol. Sci. 20, 286–290 (2011).

  • 21.

    Larsen, R. J. & Diener, E. Affect intensity as an individual difference characteristic: a review. J. Res. Pers. 21, 1–39 (1987).

  • 22.

    Gross, J. J. & John, O. P. Individual differences in two emotion regulation processes: implications for affect, relationships, and well-being. J. Pers. Soc. Psychol. 85, 348 (2003).

  • 23.

    Soleymani, M., Lichtenauer, J., Pun, T. & Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Transactions on Affect. Comput. 3, 42–55 (2011).

  • 24.

    Koelstra, S. et al. DEAP: a database for emotion analysis; using physiological signals. IEEE Transactions on Affect. Comput. 3, 18–31 (2011).

  • 25.

    Abadi, M. K. et al. DECAF: MEG-based multimodal database for decoding affective physiological responses. IEEE Transactions on Affect. Comput. 6, 209–222 (2015).

  • 26.

    Subramanian, R. et al. ASCERTAIN: emotion and personality recognition using commercial sensors. IEEE Transactions on Affect. Comput. 9, 147–160 (2016).

  • 27.

    Katsigiannis, S. & Ramzan, N. DREAMER: a database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. IEEE J. Biomed. Health Inform. 22, 98–107 (2017).

  • 28.

    Correa, J. A. M., Abadi, M. K., Sebe, N. & Patras, I. AMIGOS: a dataset for affect, personality and mood research on individuals and groups. IEEE Transactions on Affect. Comput., https://doi.org/10.1109/TAFFC.2018.2884461 (2018).

  • 29.

    Sharma, K., Castellini, C., van den Broek, E. L., Albu-Schaeffer, A. & Schwenker, F. A dataset of continuous affect annotations and physiological signals for emotion analysis. Sci. Data 6, 1–13 (2019).

  • 30.

    Yan, W.-J., Wu, Q., Liu, Y.-J., Wang, S.-J. & Fu, X. CASME database: a dataset of spontaneous micro-expressions collected from neutralized faces. In 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), 1–7 (IEEE, 2013).

  • 31.

    Schmidt, P., Reiss, A., Duerichen, R., Marberger, C. & Van Laerhoven, K. Introducing WESAD, a multimodal dataset for wearable stress and affect detection. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, 400–408 (2018).

  • 32.

    Watson, D. Mood and temperament (Guilford Press, 2000).

  • 33.

    Batliner, A., Fischer, K., Huber, R., Spilker, J. & Nöth, E. How to find trouble in communication. Speech Commun. 40, 117–143 (2003).

  • 34.

    Henrich, J., Heine, S. J. & Norenzayan, A. The weirdest people in the world? Behav. Brain Sci. 33, 61–83 (2010).

  • 35.

    Dhall, A., Goecke, R., Lucey, S. & Gedeon, T. Collecting large, richly annotated facial-expression databases from movies. IEEE Multimed. 34–41 (2012).

  • 36.

    Mollahosseini, A., Hasani, B. & Mahoor, M. H. AffectNet: a database for facial expression, valence, and arousal computing in the wild. IEEE Transactions on Affect. Comput. 10, 18–31 (2017).

  • 37.

    McDuff, D., Amr, M. & El Kaliouby, R. AM-FED+: an extended dataset of naturalistic facial expressions collected in everyday settings. IEEE Transactions on Affect. Comput. 10, 7–17 (2018).

  • 38.

    Poria, S. et al. MELD: a multimodal multi-party dataset for emotion recognition in conversations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 527–536 (2019).

  • 39.

    McDuff, D., El Kaliouby, R. & Picard, R. W. Crowdsourcing facial responses to online videos. IEEE Transactions on Affect. Comput. 3, 456–468 (2012).

  • 40.

    Morris, R., McDuff, D. & Calvo, R. Crowdsourcing techniques for affective computing. In The Oxford Handbook of Affective Computing, 384–394 (Oxford Univ. Press, 2014).

  • 41.

    Korovina, O., Baez, M. & Casati, F. Reliability of crowdsourcing as a method for collecting emotions labels on pictures. BMC Res. Notes 12, 1–6 (2019).

  • 42.

    Motley, M. T. & Camden, C. T. Facial expression of emotion: a comparison of posed expressions versus spontaneous expressions in an interpersonal communication setting. West. J. Speech Commun. 52, 1–22 (1988).

  • 43.

    Jürgens, R., Grass, A., Drolet, M. & Fischer, J. Effect of acting experience on emotion expression and recognition in voice: non-actors provide better stimuli than expected. J. Nonverbal Behav. 39, 195–214 (2015).

  • 44.

    Juslin, P. N., Laukka, P. & Bänziger, T. The mirror to our soul? Comparisons of spontaneous and posed vocal expression of emotion. J. Nonverbal Behav. 42, 1–40 (2018).

  • 45.

    Cacioppo, J. T. et al. The psychophysiology of emotion. Handb. Emot. 2, 173–191 (2000).

  • 46.

    Picard, R. W., Vyzas, E. & Healey, J. Toward machine emotional intelligence: analysis of affective physiological state. IEEE Transactions on Pattern Analysis Mach. Intell. 23, 1175–1191 (2001).

  • 47.

    Lisetti, C. L. & Nasoz, F. Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP J. on Adv. Signal Process. 2004, 929414 (2004).

  • 48.

    Rainville, P., Bechara, A., Naqvi, N. & Damasio, A. R. Basic emotions are associated with distinct patterns of cardiorespiratory activity. Int. J. Psychophysiol. 61, 5–18 (2006).

  • 49.

    Nummenmaa, L., Glerean, E., Hari, R. & Hietanen, J. K. Bodily maps of emotions. Proc. Natl. Acad. Sci. 111, 646–651 (2014).

  • 50.

    Pace-Schott, E. F. et al. Physiological feelings. Neurosci. & Biobehav. Rev. 103, 267–304 (2019).

  • 51.

    Busso, C. et al. IEMOCAP: interactive emotional dyadic motion capture database. Lang. Resour. Eval. 42, 335 (2008).

  • 52.

    McKeown, G., Valstar, M., Cowie, R., Pantic, M. & Schroder, M. The SEMAINE database: annotated multimodal records of emotionally colored conversations between a person and a limited agent. IEEE Transactions on Affect. Comput. 3, 5–17 (2011).

  • 53.

    Busso, C. et al. MSP-IMPROV: an acted corpus of dyadic interactions to study emotion perception. IEEE Transactions on Affect. Comput. 8, 67–80 (2016).

  • 54.

    Healey, J. Recording affect in the field: towards methods and metrics for improving ground truth labels. In Affective Computing and Intelligent Interaction, 107–116 (Springer, 2011).

  • 55.

    Zhang, B., Essl, G. & Mower Provost, E. Automatic recognition of self-reported and perceived emotion: does joint modeling help? In Proceedings of the 18th ACM International Conference on Multimodal Interaction, 217–224 (2016).

  • 56.

    Truong, K. P., van Leeuwen, D. A. & Neerincx, M. A. Unobtrusive multimodal emotion detection in adaptive interfaces: speech and facial expressions. In International Conference on Foundations of Augmented Cognition, 354–363 (Springer, 2007).

  • 57.

    Grossman, J. B., Klin, A., Carter, A. S. & Volkmar, F. R. Verbal bias in recognition of facial emotions in children with Asperger syndrome. The J. Child Psychol. Psychiatry Allied Discip. 41, 369–379 (2000).

  • 58.

    Dickson, H., Calkins, M. E., Kohler, C. G., Hodgins, S. & Laurens, K. R. Misperceptions of facial emotions among youth aged 9–14 years who present multiple antecedents of schizophrenia. Schizophr. Bull. 40, 460–468 (2014).

  • 59.

    Truong, K. P., Van Leeuwen, D. A. & De Jong, F. M. Speech-based recognition of self-reported and observed emotion in a dimensional space. Speech Commun. 54, 1049–1063 (2012).

  • 60.

    Hess, U., Blairy, S. & Kleck, R. E. The intensity of emotional facial expressions and decoding accuracy. J. Nonverbal Behav. 21, 241–257 (1997).

  • 61.

    Ranganathan, H., Chakraborty, S. & Panchanathan, S. Multimodal emotion recognition using deep learning architectures. In 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), 1–9 (IEEE, 2016).

  • 62.

    Min, H. C. & Nam, T.-J. Biosignal sharing for affective connectedness. In CHI ’14 Extended Abstracts on Human Factors in Computing Systems, 2191–2196 (2014).

  • 63.

    Hassib, M., Buschek, D., Wozniak, P. W. & Alt, F. HeartChat: heart rate augmented mobile chat to support empathy and awareness. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2239–2251 (2017).

  • 64.

    Liu, F., Dabbish, L. & Kaufman, G. Supporting social interactions with an expressive heart rate sharing application. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 1–26 (2017).

  • 65.

    Liu, F. et al. Animo: sharing biosignals on a smartwatch for lightweight social connection. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 3, 1–19 (2019).

  • 66.

    Liu, F., Kaufman, G. & Dabbish, L. The effect of expressive biosignals on empathy and closeness for a stigmatized group member. Proc. ACM on Human-Computer Interact. 3, 1–17 (2019).

  • 67.

    Kim, S. South Korea's refugee debate eclipses a deeper, more fundamental question. The Hill, https://thehill.com/opinion/international/395977-south-koreas-refugee-debate-eclipses-a-deeper-more-fundamental-question (2018).

  • 68.

    Kang, J.-k. Yemeni refugees become a major issue on Jeju. Korea JoongAng Daily, http://koreajoongangdaily.joins.com/news/article/article.aspx?aid=3049562 (2018).

  • 69.

    Park, N. South Korea is going crazy over a handful of refugees. Foreign Policy, https://foreignpolicy.com/2018/08/06/south-korea-is-going-crazy-over-a-handful-of-refugees/ (2018).

  • 70.

    Seo, B. In South Korea, opposition to Yemeni refugees is a cry for help. CNN, https://edition.cnn.com/2018/09/13/opinions/south-korea-jeju-yemenis-intl/index.html (2018).

  • 71.

    Diers, K., Weber, F., Brocke, B., Strobel, A. & Schönfeld, S. Instructions matter: a comparison of baseline conditions for cognitive emotion regulation paradigms. Front. Psychol. 5, 347 (2014).

  • 72.

    Gross, J. J. & Levenson, R. W. Emotion elicitation using films. Cogn. Emot. 9, 87–108 (1995).

  • 73.

    Kemper, S. & Sumner, A. The structure of verbal abilities in young and older adults. Psychol. Aging 16, 312 (2001).

  • 74.

    Yuan, J., Liberman, M. & Cieri, C. Towards an integrated understanding of speaking rate in conversation. In Ninth International Conference on Spoken Language Processing (2006).

  • 75.

    Gabig, C. S. Mean length of utterance (MLU). Encycl. Autism Spectr. Disord. 1813–1814 (2013).

  • 76.

    Graesser, A. & Chipman, P. Detection of emotions during learning with AutoTutor. In Proceedings of the 28th Annual Meetings of the Cognitive Science Society, 285–290 (Erlbaum, 2006).

  • 77.

    Afzal, S. & Robinson, P. Natural affect data – collection & annotation in a learning context. In 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, 1–7 (IEEE, 2009).

  • 78.

    D’Mello, S. K., Lehman, B. & Person, N. Monitoring affect states during effortful problem solving activities. Int. J. Artif. Intell. Educ. 20, 361–389 (2010).

  • 79.

    D’Mello, S. K. On the influence of an iterative affect annotation approach on inter-observer and self-observer reliability. IEEE Transactions on Affect. Comput. 7, 136–149 (2015).

  • 80.

    Levine, L. J. & Safer, M. A. Sources of bias in memory for emotions. Curr. Dir. Psychol. Sci. 11, 169–173 (2002).

  • 81.

    Safer, M. A., Levine, L. J. & Drapalski, A. L. Distortion in memory for emotions: the contributions of personality and post-event knowledge. Pers. Soc. Psychol. Bull. 28, 1495–1507 (2002).

  • 82.

    Lench, H. C. & Levine, L. J. Motivational biases in memory for emotions. Cogn. Emot. 24, 401–418 (2010).

  • 83.

    Park, C. Y. et al. K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations. Zenodo https://doi.org/10.5281/zenodo.3931963 (2020).

  • 84.

    Calix, R. A., Mallepudi, S. A., Chen, B. & Knapp, G. M. Emotion recognition in text for 3-d facial expression rendering. IEEE Transactions on Multimed. 12, 544–551 (2010).

  • 85.

    Wang, W., Chen, L., Thirunarayan, K. & Sheth, A. P. Harnessing Twitter “big data” for automatic emotion identification. In 2012 International Conference on Privacy, Security, Risk and Trust and 2012 International Conference on Social Computing, 587–592 (IEEE, 2012).

  • 86.

    Xu, R. et al. Word embedding composition for data imbalances in sentiment and emotion classification. Cogn. Comput. 7, 226–240 (2015).

  • 87.

    Krippendorff, K. Computing Krippendorff’s alpha-reliability. Retrieved from https://repository.upenn.edu/asc_papers/43 (2011).

  • 88.

    Lee, U. et al. Intelligent positive computing with mobile, wearable, and IoT devices: literature review and research directions. Ad Hoc Networks 83, 8–24 (2019).

  • 89.

    Picard, R. W. Future affective technology for autism and emotion communication. Philos. Transactions Royal Soc. B: Biol. Sci. 364, 3575–3584 (2009).

  • 90.

    Washington, P. et al. SuperpowerGlass: a wearable aid for the at-home therapy of children with autism. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 1–22 (2017).

  • 91.

    Buimer, H. P. et al. Conveying facial expressions to blind and visually impaired persons through a wearable vibrotactile device. PLoS One 13 (2018).

  • 92.

    Cha, N. et al. “Hello there! Is now a good time to talk?”: understanding opportune moments for proactive conversational interaction with smart speakers. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 4 (2020).

  • 93.

    Kim, A., Park, J.-M. & Lee, U. Interruptibility for in-vehicle multitasking: influence of voice task demands and adaptive behaviors. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 4, 1–22 (2020).

  • 94.

    Warnock-Parkes, E. et al. Seeing is believing: using video feedback in cognitive therapy for social anxiety disorder. Cogn. Behav. Pract. 24, 245–255 (2017).

  • 95.

    Breazeal, C. Emotion and sociable humanoid robots. Int. J. Human-Computer Stud. 59, 119–155 (2003).

  • 96.

    Kwon, D.-S. et al. Emotion interaction system for a service robot. In RO-MAN 2007 – The 16th IEEE International Symposium on Robot and Human Interactive Communication, 351–356 (IEEE, 2007).

  • 97.

    Nass, C. et al. Improving automotive safety by pairing driver emotion and car voice emotion. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems, 1973–1976 (2005).

  • 98.

    Eyben, F. et al. Emotion on the road—necessity, acceptance, and feasibility of affective computing in the car. Adv. Human-Computer Interact. 2010 (2010).

  • 99.

    Craig, A. D. How do you feel? Interoception: the sense of the physiological condition of the body. Nat. Rev. Neurosci. 3, 655–666 (2002).

  • 100.

    Markova, V., Ganchev, T. & Kalinkov, K. CLAS: a database for cognitive load, affect and stress recognition. In 2019 International Conference on Biomedical Innovations and Applications (BIA), 1–4 (IEEE, 2019).

  • 101.

    Russell, J. A. A circumplex model of affect. J. Pers. Soc. Psychol. 39, 1161 (1980).

  • 102.

    Plarre, K. et al. Continuous inference of psychological stress from sensory measurements collected in the natural environment. In Proceedings of the 10th ACM/IEEE International Conference on Information Processing in Sensor Networks, 97–108 (IEEE, 2011).

  • 103.

    Ocumpaugh, J. Baker Rodrigo Ocumpaugh Monitoring Protocol (BROMP) 2.0 technical and training manual. New York, NY and Manila, Philippines: Teachers College, Columbia University and Ateneo Laboratory for the Learning Sciences 60 (2015).


