We are pleased to announce that Dr Dimitra Petrakaki, from the University of Sussex Business School, will join us to present a talk titled “Moderation Work in a Digital Health Platform”.
This is a Zoom webinar; please register your interest to attend at: CPC.email@example.com
One of the most significant forms of labour to have emerged in the platform economy is moderation work. Although largely under-researched in organization and STS studies, moderation constitutes an important component of platforms insofar as it determines the content of the information they produce and sets the parameters of what is and is not acceptable. Compared to other types of platform work where work is transacted electronically and delivered physically (e.g. Deliveroo, Uber), moderation work is both transacted and delivered digitally through the platform (e.g. M-Turk) (Berg et al., 2018; Gandini, 2019; Howcroft & Bergvall-Kåreborn, 2018; Wood et al., 2019); in fact, it is embedded in the platform itself. Moderation implies an intervention in the content that is to be shared on a platform; it may involve editing, rewriting, or deleting part or all of that content (Gillespie, 2018). Facebook, for instance, moderates 300 million photos daily (West, 2017). The way in which content gets transformed is codified in moderation policy. Although moderation policies and principles are somewhat unique to each platform and remain largely unknown to those external to it (West, 2017), there are some generic principles that govern its function, including, for example, the prohibition of sexual, violent, racist and sexist behaviour or content, harassment, hate speech, self-harm, illegal activity and others (Gillespie, 2018). The explicit way in which content moderation may occur renders the labour it involves seemingly low-skilled: a form of information micro-tasking like, for example, voice transcription and photo editing (Berg et al., 2018).
This paper argues, however, that far from involving a series of routinised, repetitive tasks, moderation work entails the continuous exertion of discretion and the invocation of emotions, both of which occur in an invisible fashion. The study shows how moderators are asked to review and moderate information that displays ambiguity, given the lack of mechanisms to confirm the identity of the producer and the validity of the information that is communicated. Under these circumstances, moderators exercise discretion within a context of information and power asymmetries. This becomes even more significant in the context of health platforms, where the content that is produced involves the generation and exchange of health data and experiences. The paper aims to discuss more critically the implications that these challenges have for how we understand moderation as an invisible form of emotional work, and what role moderation work plays in curating patient experience.
The paper presents findings from a qualitative study conducted on a health feedback platform in the UK.
Keywords: moderation work, health platforms, feedback, experience economy.