Designing for Wellbeing

Growing concern over the impact of digital technologies on psychological wellbeing has prompted the largest technology companies to develop initiatives on ‘digital wellbeing’. However, these initiatives tend to focus on encouraging people to change their behaviour around technology rather than on changing the technology itself. Yet it is unclear why users should bear the burden of adjusting to, or self-regulating against, designs that are not aligned with their psychological needs.

Many designers are motivated to improve the psychological impact of technologies, but there is little rigorous guidance on how to do so in practice: evidence-based strategies for integrating wellbeing psychology into design work remain scarce. Our work, in collaboration with colleagues at the Wellbeing Technologies Lab (Imperial College London) and the Affective Interactions Lab (University of Sydney), addresses this gap and explores the following research questions:

  • How can wellbeing psychology be rigorously and systematically applied to the technology design process to improve the extent to which technologies support wellbeing?
  • How could this knowledge be effectively transferred to practitioners to help them support psychological wellbeing through design, keeping in mind the need for both rigor and practicality?
  • What measures can be developed for comparative and ongoing evaluations of technology experience with respect to wellbeing?
  • What evidence-based design guidelines can be derived from existing literature to help designers create more ‘wellbeing supportive’ technologies?

Resources

  • Wellbeing Supportive Design Toolkit (website)
  • Peters, D., Calvo, R. A., & Ryan, R. M. (2018). Designing for Motivation, Engagement and Wellbeing in Digital Experience. Frontiers in Psychology, 9.
  • Calvo, R. A., & Peters, D. (2019). Design for Wellbeing: Tools for Research, Practice and Ethics. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA ’19) (pp. 1–5). New York, NY, USA: ACM Press.
  • Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI: Two Frameworks for Ethical Design Practice. IEEE Transactions on Technology and Society, 1(1).
  • Calvo, R. A., Peters, D., Vold, K., & Ryan, R. M. (2020). Supporting Human Autonomy in AI Systems: A Framework for Ethical Enquiry. In C. Burr & L. Floridi (Eds.), Ethics of Digital Wellbeing: A Multidisciplinary Approach. Springer.
  • Peters, D., Loke, L., & Ahmadpour, N. (2020). Toolkits, cards and games – A review of analogue tools for collaborative ideation. CoDesign.
  • Peters D., Ahmadpour, N., & Calvo, R. A. (2020). Tools for Wellbeing-Supportive Design: Features, Characteristics, and Prototypes. Multimodal Technologies and Interaction, 4(3), 40.
  • Peters, D., & Ahmadpour, N. (2020). Digital wellbeing through design: Evaluation of a professional development workshop on wellbeing-supportive design. 32nd Australian Conference on Human-Computer Interaction. ACM.
  • Peters, D., & Calvo, R. A. (2021). Design for Wellbeing: Methods and Strategies for Supporting Psychological Needs in User Experience. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–3). New York, NY, USA: Association for Computing Machinery.
  • Peters, D. (in press). Wellbeing supportive design: 15 research-based heuristics for supporting psychological wellbeing in user experience. International Journal of Human-Computer Interaction.
  • Peters, D., & Calvo, R. A. (in press). Self-Determination Theory and Technology Design. In The Oxford Handbook of Self-Determination Theory. Oxford University Press.