Human Systems Interaction (HSI) is an evolution of the study of Human Computer Interaction (HCI), which has primarily focused on the interface between “the computer” and a person. HSI builds on that foundation to examine how computational systems themselves mediate interaction between people and other systems, and how we can therefore design systems to support those interactions – including interactions between people. This focus on computational systems as mediators highlights the increasingly pervasive role of interaction design in mediating cultural, educational, health and social experience.
While these themes have a history within HCI, HSI foregrounds design and engineering from that perspective. For example, before the Internet of Things, interaction with computational devices was clear and explicit, from pressing a keypad on a bank machine to moving a finger on a tablet screen. Increasingly, interactions with systems happen by proxy and without our awareness: how we interact with one system, such as moving across the web, invisibly affects other systems, such as which search results, ads or social media content we see. HSI asks how we can design computational systems to clearly and explicitly support interaction between a person and other systems – including other people, communities and infrastructure.
The WellthLab, founded and directed by Professor M.C. Schraefel, embodies the HSI approach in Electronics and Computer Science (ECS) at Southampton. The vision of the lab is to help “make normal better” for all. Its mission is to explore how computational systems can be designed and engineered to support that vision.
One principle guiding the WellthLab’s work is to bring scepticism to assumptions about how helpful computation can be. For example, where some approaches seek to capture as much data as possible, on the belief that more data, more of the time, is always better, WellthLab projects start from the question: what is “the minimal data dose” needed to support an interaction? This question has informed projects like Meaningful Consent in the Digital Economy, which focused on “apparency” to make invisible data transactions visible and, where that was not possible or scalable, to craft agents that take one’s data preferences and act on one’s behalf.