This paper examines how 1970s feminist demands around social reproduction are revisited in the contemporary promotional mediascape. It uses the brand ‘Fairplay’ as a case study; Fairplay is a book, podcast, website, social media presence, documentary, deck of cards and facilitator training programme, all owned by the microcelebrity Eve Rodsky. I contextualise this within the resurgence of contemporary media addressing the gendered division of labour in the home, such as Kate Mangino’s book Equal Partners.

This paper looks at these mediated debates alongside household task apps, in particular the software applications Tody and Done. I argue that these digital products and branded materials are driven by ‘technosolutionism’ (Morozov), where ‘Silicon Valley sets time’ (Wajcman 2018). They offer solutions to circulating social anxieties around burnout, life hacks and achieving work-life balance – all of which are classed, gendered and racialised.

Methodologically, I approach these materials in two ways. The first is through the lens of branding, asking how questions of social reproduction are marketed. The second uses ‘the walk-through method’ (Light et al. 2018), which involves exploring each app’s vision, operating model, Terms and Conditions, Privacy Policy, and systems of data collection and/or subscription.

This paper investigates what gendered, classed and racialised imaginings of the household are baked into their design. To what extent do these reproduce traditional hierarchies of the home? How are second-wave campaigns around social reproduction revisited, reimagined and branded?

The datafication of family life makes “the problematic relation between the home and the outside” (to use David Morley’s words) complex, multi-faceted and contradictory as never before. My paper addresses these challenges by focusing on the mediatization of households through empirical investigation (in-depth interviews, IDIs, and focus group interviews, FGIs, with “family members”) of mundane media practices, shared notions of space, and internal and external household social relations. In this regard, media are embedded in domestic routines and may foster new ones: the boundaries of mediated domesticities differ for each family member, while being unstable, incoherent, and sometimes even contradictory.
The goal of the paper is thus to learn how family members negotiate home privacies through media-oriented practices. In particular, I seek to reconstruct both the symbolic and material acts of privacy negotiation that together establish (and sometimes disrupt) boundaries between home and the outside world. Thus, I analyze family privacies as (a) always complex, context-bound and constantly reconsidered, (b) shaped by imaginaries (understandings, expectations, and evaluations) and practices that mutually reinforce each other, and (c) subject to ongoing negotiation.
In particular, I discuss:
– horizontal acts of privacy violation performed by colleagues, friends and, not least, family members;
– the changing limits of the private – challenged, flexible, prone to violation, and subject to mutual negotiation;
– the platformization of education and work environments, escalating power-related tensions affecting family members: class-related pressures (shortage of resources including hardware, space and time) and a growing excess of power within particular relations during the pandemic (distance learning, tracking and desktime apps, parental/supervisor control software);
– new norms as a response to platformization: drawing on Raymond Williams’s idea of culture as a whole way of life, the concept of self-exposure as a whole way of life (in which mediated visibility is continually negotiated and compromised) is introduced, and its effects on family life are discussed.

Risky sharenting occurs when parents and guardians regularly share sensitive and identifying information about children on social media platforms. The practice fuels the datafication of children’s lives, exposing them to risks of cyberharms whilst potentially contaminating their online and digital identities.
This paper unravels the infrastructural and structural barriers impeding ongoing efforts to disrupt legal but harmful digital cultures of parenting, of which risky sharenting is an example. To achieve its objectives, the paper draws on insights from zemiology (the study of social harms) to analyse the policies instituted by social media platforms and the data from a passive digital ethnography of a Facebook group of parents practising sharenting.
With insights from the documentary analysis and ethnography, which form part of an interdisciplinary study of sharenting funded by the Economic and Social Research Council (ESRC), the paper reveals that whilst the infrastructural barriers to harm prevention are posed by the design logics and rationalities of social media platforms, structural obstacles stem from regulatory gaps in contemporary AI governance. Together, these empower and enable the designers of the main social media platforms to embed in their technologies visible and invisible affordances capable of inviting and facilitating harmful forms of use.
This paper draws on the findings of the discourse analysis and digital ethnography to develop a remedial framework that outlines the harm causation process enabled by regulatory gaps and technology affordances, and the points at which preventative policies should be introduced to disrupt the process. Through its analysis of the nexus of regulatory gaps, technology affordances, and harms, the paper advances the interdisciplinary scholarship on AI ethics and governance. More specifically, by providing the empirical example of risky sharenting, the paper expands the nascent literature on the harms of emerging cultures of digital parenting facilitated by AI technologies.

The ConnecteDNA research project explores the impact of direct-to-consumer genetic testing (DTCGT) on gamete (egg and sperm) donor conception. One implication of the increased popularity of DTCGT is that donors, donor-conceived people and parents through donor conception can share their (or their child’s) DNA data on DTCGT databases. Using the ‘matching’ function these sites offer, in combination with social media platforms and ‘official’ sources of information, they can then identify unknown genetic relatives – sometimes very easily, and sometimes completely unexpectedly.

Drawing on semi-structured interviews and focus group discussions with donors, donor-conceived people and parents through donor conception, we explore the power of DTCGT companies, in combination with social media platforms, over bodily material re-incarnated on the internet. In that environment, DNA data is no longer an embodied blueprint, unknowable until it plays out in the space-time of someone’s life. Rather, DNA information in electronic form has, for donors, donor-conceived people and their families, the power to interrupt, radically (re)shape or transform families. DTCGT, often marketed as harmless fun, is sometimes just that. However, DNA shared through DTCGT sites has relational consequences that can also shock, traumatise and cause deep rifts within family landscapes. Our research explores (whether and) how donors, donor-conceived people and their families make sense of life after finding or uncovering information from DTCGT, and how they think the regulatory environment needs to change to offer protection for future families through donor conception.

Family intervention is a long-established mechanism of state control, but recent technological developments are facilitating new regulatory capacities and objectives. This paper will explore how contemporary policy interventions in the UK are converging around a technological solutionist ideology that centres family relationships as core instruments of social management. The last decade has seen a marked techno-administrative turn, with family-state relationships increasingly mediated through online portals and dashboards. Over the last few years this data-centric model has accelerated towards an algorithmic approach to governance through the incorporation of big data surveillance, predictive analytics and behavioural interventions to monitor and regulate populations. We trace the embedding of data collection frameworks into apparently conventional family intervention programmes and show how this ‘datafication’ was made into a core delivery tool. We also highlight how secrecy, or at the very least strategic silence, has restricted public knowledge of how and why data is being collected and used in the UK. We show how parents and children are being quantified and translated into data points to support new logics of choice manipulation, ceding unprecedented power to financiers, data analytics companies, platform developers and big tech companies. We argue that public and private data extraction, and its furthering of behaviourist agendas, have serious implications for families and as such deserve critical scrutiny.

Families currently use a range of technologies to locate, track, and inform each other of their physical location and activities. These include GPS-enabled devices and dedicated location-based software applications such as Life360. To date, research has focused on perceptions and uses of these tracking technologies within private family contexts; however, there is no research into how these technologies are received in the wider public imagination. This paper contributes to knowledge about family location tracking technologies by investigating public representation and debate around their uses, meanings, and impacts. The study offers a topic-based and thematic content analysis of public conversations about Life360 and family tracking apps on three key social media platforms – Twitter, YouTube and TikTok. It provides both a platform-specific and cross-platform analysis to understand how these technologies are publicly perceived and contested. The themes identified across the three platforms align with their varied cultures of use and platform vernaculars, with Twitter emphasizing newsworthy topics and events, YouTube focusing on commercial product reviews and tutorials, and TikTok posts using humor and memes to express everyday experiences and political expressions. Finally, the cross-platform analysis highlights the power of an antagonistic and ambivalent platform vernacular found within the younger user community on TikTok to influence wider public topics of discussion across other social and mainstream media.

Families are increasingly using apps and devices that provide detailed information about the location and activities of children and other family members. While tracking is typically performed for benevolent reasons such as maintaining child safety, technologies like Life360 and Find My Phone raise concerns about snooping and surveillance. This paper examines parental behaviours and attitudes towards this controversial practice via an online survey which collected responses from Australian parents of children aged 5-18. A significant number of parents reported using tracking tools. Parents’ views about the practice were sometimes ambivalent and sometimes in disagreement with one another. Perspectives variously included defending geo-tracking as conducive to child wellbeing and to family management and logistics, attacking the language of surveillance used to describe it, and opposing the use of these technologies as antithetical to child independence and choice. After exploring these themes, the paper builds on the literature on child and family location tracking by identifying and critically discussing the socio-ethical issues of changing family norms associated with powerful child monitoring technology, child autonomy and consent, and the normalisation of geo-tracking and surveillance. The discussion employs Helen Nissenbaum’s concept of contextual integrity to evaluate family and child privacy and to illuminate the socio-ethical complexity of this evolving technological practice.

This paper presents one chapter from my PhD thesis, which uses feminist and queer approaches to consider the human rights impact of the collection and sharing of data in children’s services in England. My thesis draws on critical data studies to examine how the collection and use of data interact with systems of power: they shape who can know what about the world, and to what uses this knowledge can be put. This chapter examines one specific case study in existing programming in children’s social care: the ‘Troubled Families Programme.’ As I show, a key objective of this programme is to increase the use of data by local authorities. I will argue that the concept of ‘family’ in this data does not correspond with how the concept is defined in law, policy or practice.

I situate the collection and sharing of data within the history of information-gathering and decision-making in children’s services, and within the political choices which have shaped service delivery and datafication. Classification and categorisation are used to define the ‘family’ as a unit of analysis, which enables the identification of the ‘problem family’ and, further, its definition as implicitly outside the norm. Through examining the ways in which data systems classify, categorise and stereotype individuals who are known to social services, I show how the expectation that individual and family lives are legible to computers is used to normalise certain forms of family, and to stereotype those who do not comply as ‘troubled.’ I argue that the use of data in this programme encourages and naturalises simplistic, Aristotelian classification: both to categorise people into families, and to classify families as ‘troubled’ or (implicitly) ‘normal.’

Data collection and sharing are portrayed as actively beneficial for child welfare provision in the UK; in this paper, however, I argue that they promote a simplistic view of what makes a good family. In place of families that work together, and state support that helps them to do so, the ‘Troubled Families Programme’ and its associated datafication project support an antiquated idea of what makes a good family and promote work as the solution to all ills.

This paper discusses and explains why disparities and discrepancies prevail between the Global North and the Global South with regard to the datafied family and datafied society in general. The main question is why the Global South remains reluctant, hesitant or even hermetically closed when it comes to revealing family secrets and facts.
Algeria, as part of the Global South, serves as the case study for this paper. In this country, opacity and conservative values and principles appear to prevail without challenge. The fields of culture, education, ethics, religion and politics constitute the main obstacles to building up a datafied family and the family dynamics that come with it. Beyond that, social inequality, a trust gap, injustice, a lack of transparency, weak law and order, and limited freedom of the press have all contributed to the adoption of secretive and discreet attitudes.
It is assumed that the democratisation of society is intimately linked to levels of political awareness, openness, commitment, civil society and citizen engagement; in this respect, Algeria is still striving to achieve such democratic goals, values and practices.
Yet the conception, perception and implementation of a datafied society remain far from well explained or understood. Algeria has inherited a socialist regime with a single-party system and a single way of thinking that rejects opposing views. Citizens are trying hard to catch up with a new political, economic and cultural environment based on principles of plurality and diversity of opinions and ideas.
If my paper is accepted, I will provide some answers concerning the difficulties and constraints of establishing the datafied family in the Global South, and in Algeria in particular.