How AI is helping us connect in digital spaces
With the COVID-19 pandemic resulting in stringent social restrictions around the world, I was fascinated by how individuals (including myself) adapted to our newfound situation. Living in a new reality where I could not meet up with friends and family, I, like others, turned to social technologies to satiate my need for connection. However, with all other social affordances stripped away, I was left unsatisfied with the current capabilities of online socialising. Online messaging and video-calling did not seem to satisfy my social cravings; at times, they were dry, awkward, and overwhelming. It truly took a pandemic to reveal the deficiencies of digital devices in emulating the deep, rich, exciting (and often messy) offline interactions we took for granted in our pre-pandemic lives.
Therefore, as the CHI 2021 conference came around, I was excited to see what new research was being undertaken to make the online socialising experience more meaningful. As I scoured the programme, favouriting talks relating to online communication, social media, and the like, I noticed that two studies incorporated artificial intelligence (AI) to make online interactions more affect-sensitive through ‘AI-mediated communication’. ‘Affect’ refers to the psychological common denominator of our emotional lives, underpinning emotions, moods, and feelings (Russell, 2003). So how were researchers leveraging AI to make our online interactions more affective, and should they be?
The first study, by Murali et al. (2021), titled AffectiveSpotlight: Facilitating the Communication of Affective Responses from Audience Members during Online Presentations, developed and investigated the efficacy of an affect-sensitive AI bot embedded in Microsoft Teams (named AffectiveSpotlight). The study aimed to address the problem of limited audience-presenter interaction during online presentations by capturing the affective responses of the audience and communicating them to the presenter. The bot analysed the emotive responses (those valued by presenters) of each audience member in real time and spotlighted the most expressive member to the presenter without labelling the emotion, allowing the presenter to interpret it themselves. The study found that using AffectiveSpotlight improved the presenter’s experience: presenters felt more aware of their audience, spoke for longer periods (implying reduced speaker anxiety), and rated their own presentation quality closer to how audience members rated it. Whilst these were promising results, participants were drawn solely from the tech sector, limiting the generalisability of the findings to other groups who also present online (e.g., teachers presenting to students).
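To make the spotlighting idea concrete, here is a minimal sketch of how I picture that step working. This is my own illustration under stated assumptions, not the authors’ implementation: the names AudienceFrame and pick_spotlight are hypothetical, and the expressiveness scores are placeholders for what a real facial-expression model would produce each sensing window.

```python
# Hypothetical sketch of the spotlighting step: pick the most expressive
# audience member in a sensing window and surface only *who* they are,
# with no emotion label attached (interpretation is left to the presenter).
from dataclasses import dataclass


@dataclass
class AudienceFrame:
    member_id: str
    expressiveness: float  # placeholder score, 0.0 (neutral) to 1.0 (highly expressive)


def pick_spotlight(frames: list[AudienceFrame]) -> str | None:
    """Return the ID of the most expressive audience member in this window."""
    if not frames:
        return None
    most_expressive = max(frames, key=lambda f: f.expressiveness)
    return most_expressive.member_id


# Example: three audience members sensed during one time window.
window = [
    AudienceFrame("alice", 0.12),
    AudienceFrame("bala", 0.78),
    AudienceFrame("chen", 0.45),
]
print(pick_spotlight(window))  # -> "bala" would be spotlighted this window
```

The key design choice the paper highlights, and which the sketch tries to mirror, is that the system selects but does not interpret: the presenter sees a face, not a label.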
The second study, by Liu et al. (2021), titled Significant Otter: Understanding the Role of Biosignals in Communication, explored the use of smartwatch-based AI software (named Significant Otter) that analysed users’ biosignals (heart rate) to generate a set of possible emotional states, from which the user could choose one to send to their romantic partner via animated otters. The qualitative study followed romantic couples for over a month to investigate the role of Significant Otter’s biosignal-sharing capability in their communication. Couples reported that sharing biosignals in this manner supported easier, more authentic communication and nurtured a greater sense of social connection. These were exciting results; however, the paper did mention that some participants questioned themselves when the suggested emotional states did not match what they felt internally, while others would blindly accept Significant Otter’s suggestions, leading them to reflect less on their actual state.
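In a similar spirit, the sketch below illustrates the biosignal-to-suggestion flow as I understand it. Again, this is my own invention rather than Significant Otter’s actual model or API: the thresholds, state names, and the suggest_states function are made up purely to show the shape of the interaction, in which the user, not the system, decides what (if anything) to share.

```python
# Hypothetical illustration: map a heart-rate reading to a shortlist of
# candidate emotional states; the sender picks which one, if any, to share
# with their partner. Thresholds and labels are invented for this example.
def suggest_states(heart_rate_bpm: float, resting_bpm: float = 65.0) -> list[str]:
    """Return a shortlist of candidate emotional states for the user to choose from."""
    elevation = heart_rate_bpm - resting_bpm
    if elevation > 30:
        return ["excited", "stressed", "exercising"]
    if elevation > 10:
        return ["engaged", "anxious"]
    return ["calm", "relaxed", "sleepy"]


# Example: a reading of 98 bpm yields high-arousal candidates for the user to pick from.
print(suggest_states(98.0))  # -> ['excited', 'stressed', 'exercising']
```

Notice that, as in the spotlighting example, the biosignal narrows the options but the human makes the final call, which is exactly where the participants’ self-questioning (and blind acceptance) becomes an interesting tension.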
These papers fascinated me; I had never thought that AI could help close the emotional gap between individuals in digital spaces, or that doing so could facilitate richer online interactions. Research into affect-sensitive AI is integral to the HCI field of affective computing. There have been past calls in this field to frame affect not as discrete units of information to be processed by a computer (affect as information), but as dynamic, socio-culturally embedded outcomes experienced through interaction (affect as interaction) (Boehner et al., 2005). It was interesting to see the new technologies introduced at CHI 2021 veering towards the latter framing, where participants in Liu et al. (2021) and Murali et al. (2021) were left free to interpret the emotions presented by the technologies and ascribe meaning themselves. It is exciting to see current affect-sensitive technologies taking this perspective because, viewed through a wider lens, it signals a shift from technologies being representational tools to being participatory tools. I cannot wait to see what is in store next!