International Perspectives: Valuing autonomy of teacher and child in cloud classrooms
01 Dec, 2022
Digital learning is rapidly becoming a reality in classrooms around the globe through the use of online platforms. In a recent publication in the Harvard Educational Review coauthored with José van Dijck, we argued that this platformization of education—the integration of digital platforms into daily school practices—is a major cause for concern worldwide regarding the autonomy of schools, teachers, and young children.
First, global technology giants like Google (Alphabet), Apple, Facebook (Meta), Amazon, and Microsoft (GAFAM)—Big Tech—are rapidly expanding their hardware and digital services into the edtech market, increasingly seizing control over the shaping and organization of online learning environments in schools. Second, as a diverse set of global and local educational technologies—digital learning platforms, learning tracking systems, parental communication apps—becomes interwoven with everyday classroom teaching and learning, control over pedagogical decision-making shifts from teachers to commercial digital platforms. These platforms' capacity to monitor and process children's online interactions—often without their or their parents' consent—constitutes a serious threat to privacy. To understand how platforms affect the fundamental freedoms of teachers and children in the classroom, it is key to further investigate Big Tech's shaping—'Googlization' or 'Amazonification'—of public education in relation to the operations of 'small-scale', often locally developed, edtech that in the past decade has become firmly anchored in classroom teaching and learning in countries worldwide.
Our research focuses on the platformization of primary school classrooms in the Netherlands. In the Dutch school system, the use of digital technologies is becoming a key part of daily classroom practice. Pupils and teachers make increasing use of Chromebooks, iPads, and laptops, which provide the hardware infrastructure for learning and teaching in increasingly complex cloud environments. Single sign-on cloud portals, such as COOL (Cloudwise) or Prowise GO (Prowise), facilitate access to a rich landscape of cloud-based software, including a category of 'intelligent' edtech such as learning analytics systems (LeerUniek), internet and safeguarding tools (KlikSafe), and adaptive learning platforms (Snappet, Gynzy, Bingel). These adaptive technologies—used by more than half of all schools in Dutch primary education—employ algorithmic analytics to tailor educational content to a child's learning needs. Importantly, the algorithmic back-ends of Dutch adaptive learning platforms are increasingly built on machine-learning cloud services provided by Big Tech—often without the knowledge of schools, teachers, children, and their parents. Snappet, an official Amazon Web Services Public Sector Partner, for example, employs Amazon's machine-learning stack, and the back-end of Bingel was (until recently) powered by the American adaptive learning technology Knewton, which is likewise built on top of AWS. Learning in these corporate yet indistinctly bounded digital spaces of the cloud raises poignant issues about young pupils' data flowing invisibly from classrooms into the global infrastructures of tech giants.
In the classroom, teachers interact through these platforms' interfaces, or dashboards, which provide them with real-time insight into student performance and which in the past few years have become pivotal technologies that initiate and inform their pedagogical actions for personalized learning. Schools and teachers, however, are insufficiently aware that the design of educational platform technologies—including dashboards—powerfully shapes behavior.
Edtech is never a neutral container. Dashboards are normatively loaded with ideas, values, and purposes that embody particular views of learning, steering teachers' interpretations and actions toward certain pedagogical choices. Teacher dashboards in Dutch adaptive learning technologies, for example, spotlight performance as the true locus of teacher control and manipulation. Real-time graphical displays of student progress—performance relative to target levels and peers, competence level, and growth—present a set of actionable levers teachers can pull to tweak learning, pushing students to shift from 'red' to 'green', from 'below average' to 'average'. Encoding a model of teaching as performance optimization, these dashboards construct particular views of the complex reality of learning, based on particular ideas of what education is and should be—to the detriment of others. Of course, teachers' capacities to interpret and act on dashboard intelligence always exceed the framings imposed on them. Nonetheless, to better align their pedagogical decision-making with their own and their schools' educational values (rather than those of digital platforms), increased awareness of edtech's normative framings is key.
To support teachers in the pedagogically conscious use of digital systems in the classroom, Kennisnet, the Dutch public organization for Education & ICT, recently published the report 'Scratching on the Dashboard' (in Dutch, 2022). Based on Dutch teachers' experiences with adaptive learning platforms and dashboards, the report discusses how classroom technologies shape teachers' professional practice. It underscores that better insight into a dashboard's pedagogical design can significantly help teachers put its particular representations of the learning child into perspective and improve their confidence in making decisions based on their own professional knowledge and insights. That all starts with understanding what a dashboard does and does not show: the reality of learning is always more complex, and there is always much more to say about a child than what the data show. Nevertheless, compared to the speed at which digital technology rumbles into the Dutch classroom, 'Scratching on the Dashboard' is only a drop in the ocean: schools' knowledge about the pedagogical impact of edtech is limited, and edtech is still predominantly seen as an innocent backdrop to pedagogy. The danger is that learning platforms, rather than schools and teachers, will increasingly dictate pedagogy—platform algorithms and interfaces prescribe what 'good education' is and what agency teachers should have to exert control over learning. These platform pedagogies do not necessarily represent the educational and pedagogic values of public schools and teachers, and they are often not transparent to educational professionals.
At the same time, digital platforms' opaque data practices place children's privacy at risk and significantly diminish schools' control over their own data flows. In 2021, several Dutch educational associations carried out a Data Protection Impact Assessment (DPIA) of Google Workspace for Education to investigate whether Google's data flows comply with European privacy regulation. The results indicate that Google's processing of data does not comply with the General Data Protection Regulation (GDPR) and involves significant privacy risks that contest the very legal foundations of that regulation. To safeguard children's privacy in classrooms, it is highly important and urgent that the data flows and analytics of small-scale edtech platforms are structurally assessed for their privacy impact as well. This is confirmed by a recent global investigation by Human Rights Watch into education technology (EdTech) endorsed by 49 governments for children's education during the pandemic, which found that many of the online learning platforms "put at risk or directly violated children's privacy and other children's rights, for purposes unrelated to their education".
Yet it is equally important to investigate platforms' pedagogical shaping and to ensure that the platformization of classrooms is not reduced to a privacy issue. In addition to the DPIA, and with the aim of aligning classroom digitization with the pedagogical climate of the school, Utrecht University and Kennisnet have recently begun developing a 'Pedagogical Impact Assessment' (PIA). The PIA is intended as an instrument to probe digital education platforms on their capacities for shaping teaching and learning, drawing on the experiences of teachers and critical examinations of platform architectures. Now that classroom practices take concrete shape against and through the built form of cloud-based educational technologies, (D)PIAs are essential for helping teachers, schools, and the educational sector at large to seize pedagogical control over their use and to protect children's right to privacy.