
International Perspectives: The datafied lives of children and young people

07 Nov, 2022

In Autumn 2021, we interviewed a family of four who live in a semi-detached home in a suburban area of Lombardia, Northern Italy. The kids, Francesco and Stefano, were aged 6 and 4 when Italy was under its first, and most severe, lockdown in March and April 2020. Since Francesco was already in primary school, the family signed up for their first home internet connection and bought a laptop and a tablet so that he could take part in remote learning. Since then, the two boys, now 8 and 6 respectively, have turned into skilled and passionate gamers: the basement hosts two second-hand Wii consoles, and they play Fortnite together whenever they have some time off from school or sports activities. They also access Roblox and other free gaming apps on their father’s old phone, and their school encourages them to complete exercises on Google Classroom. Francesco’s and Stefano’s everyday life is not dissimilar to that of their peers. Yet their experiences raise a number of issues we would like to discuss in this post:

  • The pervasiveness of datafication, and its tremendous expansion during the COVID-19 pandemic;
  • The tangible outcomes of data, algorithms and automated decision making on children’s and families’ lives;
  • The cultural premises and consequences of datafication.

The pervasive datafication of children’s lives

Francesco’s and Stefano’s experience first reminds us of the increasing datafication of education. Educational institutions of all kinds, from early years settings to higher education, have become accustomed to using data-driven educational technologies in a variety of ways: letting worried parents monitor their child’s daily routines in nurseries, personalising the curriculum around each student’s needs, predicting outcomes and preventing risks, providing insights into the processes of learning, and moulding students’ emotions to maximise learning. In some kindergartens, AI-powered humanoid robots work alongside human teachers, telling folk tales and asking pre-schoolers follow-up questions to test their listening comprehension. In fact, the use of Edtech tools and apps has become so pervasive in early childhood settings that scholars have accused the early years sector of being “obsessed with data”. The COVID-19 pandemic, with its repeated lockdowns and other social distancing measures, has undoubtedly expanded and accelerated the extraction of students’ data, with privacy principles being suspended or “coronawashed”.

Remote learning meant that almost 1.6 billion children all over the world were coerced into using a variety of data-intensive educational online platforms (UNESCO, 2020). In fact, according to a study carried out by Human Rights Watch, nearly 90 percent of the educational apps and platforms (N = 194) that were used for learning during the COVID-19 pandemic harvested children’s data and shared it with marketers and data brokers, turning children’s data into a highly profitable commodity.

Many families also had to buy new devices, including tablets and computers, so that every household member could learn or work from home. Although in some countries, for example the US and Singapore, many schools provided students with devices for home-based learning, such help often came at a price: student activity monitoring software was installed on these school-issued devices. Such software (e.g. Gaggle) is mainly used to “protect student safety and ensure their well-being” (Jones & Najera Mendoza, 2021:3). However, there is insufficient evidence that social media surveillance or content monitoring of students’ accounts is an effective strategy for addressing schools’ growing concerns about cyberbullying, students’ mental health (including indicators of self-harm and depression), or school violence and other risky behaviours. These monitoring solutions do not only scan students’ online searches, the content of their emails and other documents, and their online communications; they also give schools a range of monitoring and control functionalities, including blocking inappropriate material, tracking logins to school-related applications, viewing students’ screens in real time, closing web browser tabs, and taking direct control of input functionality. This creates a variety of concerns related to children’s rights, privacy and free speech, with no real option for students to opt out of such surveillance.

Thus, during the pandemic, our already media-rich homes became filled with even more technologies that extract data from the flow of everyday life practices and identities, and convert them into profitable resources for platforms. The growing market of domestic IoT (Internet of Things) devices – including smart speakers, internet-connected toys, home hubs and many smart appliances – expands the range and detail of “data shared in the home”. These devices also support multiple mediatised parenting practices, such as sharenting (sharing one’s parental joys and challenges) and documenting children’s lives online. They also enable “caring dataveillance”: the use of wearable devices and other technological gadgets and apps to make parenting easier, whether by helping to keep babies healthy or, as children grow up, by providing parental controls to keep them safe online and tracking technologies to keep an eye on them from a distance.

As with parenthood and parenting practices, children’s and young people’s leisure time, self-expression and sociality have also become more and more digitalised, and therefore dependent upon the commercially driven business logic of data capitalism. Spontaneous practices like play and informal interactions with peers are now coded, standardised and manipulated by algorithms.

In sum, children today are conceived and raised in a digital world where almost all aspects of our lives, on- and offline, are increasingly monitored, analysed, and transformed into quantifiable data. With what consequences?

In our own book, we have documented ways in which algorithmic classifications can exclude children from access to certain types of education, health insurance and loans, exacerbating inequalities and pre-determining children’s experiences and life courses, while leaving us with little or no control over the ways in which algorithms represent us and make decisions on our behalf. Algorithms are indeed biased (either because they are trained on biased historical data or because biases are built into their design), as well as opaque and unexplainable. So why do we trust data, algorithms and automation? These are key challenges as we continue to investigate the role of digital technologies in children’s lives.



About the authors

Associate Professor Giovanna Mascheroni is a prominent international expert in the sociology of media, specialising in children and digital media, online opportunities and risks, and the datafication of children’s and families’ everyday lives. Giovanna’s work focusses on the social ...
Andra Siibak is a Professor of Media Studies and Program Director of the Media and Communication doctoral program at the Institute of Social Studies, University of Tartu, Estonia. Her main field of research concerns the opportunities and risks surrounding internet use and the datafication of childhood ...

