Papers

Please see below the range of papers accepted to our Scent InContext DIS 2023 Workshop; some are displayed as abstract only at the participants' request.

Abstract: Multisensory experiences can reach more diverse audiences. With inclusivity emerging as an important value both in industry and in education, there is a need to explore alternative types of interaction. Olfactory technology can introduce new design possibilities for inclusive social experiences beyond the prevalent focus on visual- or audio-based systems. Children with sensory disabilities (e.g., visual or hearing impairments) may experience exclusion from activities that rely on a single sense. This short paper supports the goal of designing ways to incorporate olfactory technology into social experiences to promote inclusion and engagement among children with different abilities.

Abstract: Olfactory-based interactions (OBI) are steadily increasing due to advanced olfactory displays developed by established researchers and commercial companies. However, there is a lack of literature that investigates both qualitative and quantitative understandings of users’ conscious olfactory abilities, perceptions, and reactions. This preliminary work presents steps toward exploring a low-fidelity scent medium for OBIs that evaluates users’ ability to identify and discriminate various synthetic scents via scratch-and-sniff stickers. Paper olfactory displays utilized during OBI investigations can deliver quality scented experiences while increasing users’ confidence in their smelling capabilities and memory recall. We conducted individual usability studies with undergraduate and graduate students (N=40). Observations from this study suggest that synthetic scents imitating natural odors created an affective impression on users. Insights from this experimental design reveal the exigency of encouraging researchers to use simple scent mediums to explore participants’ olfactory abilities and perceptions while also ensuring quality experiences.

Over the years I have intermittently talked and written about nasal interfaces. This was in part in homage to the classic UIST spoof paper on the nose mouse [5], but more seriously looking at the way the specific spatio-temporal characteristics of smell can be used as a metaphor for other forms of (non-smell-based) interaction. In this position paper I’ll revisit these arguments as a way of exploring how physical and neurological aspects of smell may influence the fundamental way we experience the world olfactorily, and hence how we design for olfactory experience.

Opportunities and Challenges for Olfactory Devices in Dementia: Lessons Learned from Sound-Based Technologies

Maarten Houben

Abstract: Research in HCI has demonstrated how digital media can evoke memories, stories, and emotional responses from people with dementia. While there is an extensive body of research on visual and auditory stimuli, the role of scent or olfactory devices in dementia has received limited research attention. In this position paper, I present several opportunities and challenges regarding the potential of olfactory devices to support people living with dementia. These challenges and opportunities are grounded in insights from the ‘Everyday Sounds of Dementia’ project, which investigated the role of everyday sounds in technologies to support the quality of life of people with dementia. Based on the lessons learned from this project, this position paper concludes with future opportunities and challenges for scent in the context of dementia care.

An At-Home Digital Solution for Smell Loss

Sanjoli Mathur, Carl Philpott, and Marianna Obrist

Abstract: Studies have shown that losing our sense of smell can adversely affect our health, well-being, and quality of life [3, 4, 5]. There is evidence that Olfactory Training can help recover smell function [1, 6]. This paper provides an overview of the Smell Care project, a 6-month feasibility study testing a digital solution (technology probe) for Olfactory Training. The paper focuses on the intersection of this study with architecture and the built environment. It describes how the study will assess the integration of the digital solution into people’s homes. It also explores future pathways for the study, specifically: (1) how the device fits into smart homes, and (2) the diversification of the technology probe’s user groups to non-UK participants, and the cultural complexities this will bring [2].

How computers can shape the relationship between non-human primates and humans through multi-sensory experiences: Developing multi-modal devices for Lemurs

Ilyena Hirskyj-Douglas, Stephen Brewster, Vilma Kankaanpaa and Jiaqi Wang

Abstract: The human-animal relationship has been essential to human evolution and development, providing mental health benefits, contributing to the sustainable development of the environment, and helping to maintain species balance. Unfortunately, human-centered behavior has also caused significant harm to many animals through hunting, trading, environmental destruction, and captivity. It is critical that we address these issues and work towards rebuilding this relationship, not just for the benefit of humans, but for the animals as well. Zoos are a popular place for people to interact with animals. However, research has shown that many animals, particularly primates, can become stressed by the shouting and feeding behavior of visitors. This can harm the animals’ well-being and affect their natural behaviors. In view of this situation, this research will look at how computers can support primate-visitor interactions in ways that support both the visitor and the animal appropriately. This will be done with black-and-white lemurs, chosen due to their diverse sensory types, remarkable social behavior, and endangered status. The thesis will be structured around the following research studies:

Study 1: What attracts lemurs to multi-modal computers, and what do these look like?

Study 2: How can human visitors to the zoo be attracted to interact with lemurs via a computer?