BEGIN:VCALENDAR
PRODID:-//Columba Systems Ltd//NONSGML CPNG/SpringViewer/ICal Output/3.3-
 M3//EN
VERSION:2.0
CALSCALE:GREGORIAN
METHOD:PUBLISH
BEGIN:VEVENT
DTSTAMP:20200521T075137Z
DTSTART:20200623T130000Z
DTEND:20200623T140000Z
SUMMARY:Centre for Primary Care Seminar – Dr Dimitra Petrakaki – ‘Moderat
 ion Work in a Digital Health Platform’ via Zoom
UID:{http://www.columbasystems.com/customers/uom/gpp/eventid/}xu5-kagh859
 1-pa1qwc
DESCRIPTION:\nAbstract\nOne of the most significant forms of labour to h
 ave emerged in the platform economy is moderation work. Although largel
 y under-researched in organization and STS studies\, moderation constit
 utes an important component of platforms insofar as it determines the c
 ontent of the information they produce and sets the parameters of what i
 s acceptable and what is not. Compared to other types of platform work w
 here work is transacted electronically and delivered physically (e.g. D
 eliveroo\, Uber)\, moderation work is both transacted and delivered dig
 itally through the platform (e.g. M-Turk) (Berg et al.\, 2018\; Gandini
 \, 2019\; Howcroft & Bergvall-Kåreborn\, 2018\; Wood et al.\, 2019)\; i
 n fact it is embedded in the platform itself. Moderation implies an int
 ervention in the content that is to be shared on a platform\; it may in
 volve editing\, rewriting or deleting part or all of that content (Gill
 espie\, 2018). Facebook\, for instance\, moderates 300 million photos d
 aily (West\, 2017). The way in which content gets transformed is codifi
 ed in moderation policy. Although moderation policies and principles ar
 e somewhat unique to each platform and remain largely unknown to those e
 xternal to it (West\, 2017)\, some generic principles govern this funct
 ion\, including for example the prohibition of sexual\, violent\, racis
 t and sexist behaviour or content\, harassment\, hate speech\, self-har
 m\, illegal activity and others (Gillespie\, 2018). The explicit way in w
 hich content moderation may occur renders the labour it involves low-sk
 illed\, a form of information micro-tasking comparable to\, for example
 \, voice transcription and photo editing (Berg et al.\, 2018).\n\nThis p
 aper argues\, however\, that far from involving a series of routinised\
 , repetitive tasks\, moderation work entails the continuous exertion of d
 iscretion and the invocation of emotions\, both of which occur in an in
 visible fashion. The study shows how moderators are asked to review and m
 oderate information that displays ambiguity\, given the lack of mechani
 sms to confirm the identity of the producer and the validity of the inf
 ormation that is communicated. Under these circumstances moderators exe
 rcise discretion within a context of information and power asymmetries.
  This becomes even more significant in the context of health platforms\
 , where the content that is produced involves the generation and exchan
 ge of health data and experiences. The paper aims to discuss in a more c
 ritical way the implications that these challenges have for how we unde
 rstand moderation as an invisible form of emotional work and what role m
 oderation work plays in curating patient experience.\n\nThe paper prese
 nts findings from a qualitative study conducted on a health feedback pl
 atform in the UK.\n\n\nFor registration and Zoom login details\, please e
 mail: CPC.seminar.series@manchester.ac.uk
STATUS:TENTATIVE
TRANSP:TRANSPARENT
CLASS:PUBLIC
END:VEVENT
END:VCALENDAR
