Self-reports are the most accurate means of assessing mood. They can be administered frequently, and self-report tools are valuable for quantifying and monitoring one's mental well-being. Traditionally, self-reports are provided using numerical or graphical scales; however, these are known to be prone to systematic measurement errors. Facial expressions, in contrast, are intrinsically connected to emotional experiences and serve as a tool for communicating our emotions. We are well-versed in enacting and recognising facial expressions, which makes them suitable representations of mood. Tools relying on facial expressions can therefore expand the design space for mood self-report technologies.
Depression is an affective disorder that is particularly pervasive in contemporary society. Its severity is typically measured across individual symptoms using screener questionnaires. However, when administered frequently, the assessment quality of these questionnaires is known to degrade significantly. Hence, by identifying salient facial features indicative of depression's symptomatology, facial expression-based tools can capitalise on the strengths of self-reports and be used for assessing or monitoring depression's severity.
This thesis iteratively explores the design and implementation of four prototypes for mood self-reports. Three empirical studies evaluate the method within three experimental contexts: using text and images to elicit emotions in situ, and monitoring mood in the wild. The method was evaluated quantitatively, by contrasting self-reports with those provided via the well-known visual analogue scale, and qualitatively, by identifying aspects of importance for facial expression-based tools and exploring users' preferences. Thereafter, an exploratory study was conducted to identify and visualise facial features indicative of symptoms of depression, as a step towards creating disorder-specific self-report instruments. Finally, EmotionAlly, a prototype for contextualised assessment, tracking, and visualisation of mood using computer-generated facial expressions, was developed, integrating findings from the preceding quantitative and qualitative evaluations.