
International Perspectives: Protection or punishment? Navigating parental control apps for keeping your children safe online


03 Aug, 2022

Children are spending an unprecedented amount of time online each day via their smartphones and tablets. While the Internet has become an essential enabler for children to learn, have fun, and grow, there could be significant risks associated with children going online, especially as they engage in new digital activities.

For many adults, concerns have increased regarding excessive screen time, as well as more direct risks including cyberbullying and exposure to inappropriate or harmful content. In response to children’s increasing exposure to both classes of risk, a new genre of apps, known as parental control apps, has emerged. These tools are designed to act as technical mediation support for parents, facilitating access to, and control over, their children’s online activities as a means of protecting them from such harms (Zaman and Nouwen, 2016). These apps have grown rapidly in popularity over the past few years: the global parental control software market is anticipated to grow from USD $1.52 billion in 2017 to USD $2.53 billion by the end of 2023.

This rapid adoption of, and increasing reliance on, parental control apps have raised corresponding questions about their efficacy: how such apps fit into existing digital parenting practices and habits, and what effects they are having on parents and their children. Between July and August 2020, we identified 241 apps on the UK Google Play Store related to “child online safety,” “parental controls,” “parental monitoring,” “cyberbullying,” and similar terms. Our goal was to perform an updated analysis of the most popular Android parental control apps on the UK market, examining which design features dominate the current app market and how children and parents perceive them (Wang et al. 2021).

We found that nearly 80% of these 241 apps had fewer than 20 downloads, while the most popular apps had many thousands of downloads; in other words, app popularity followed what is often called a long-tail distribution. We identified 66 apps with at least 10,000 downloads. We then removed apps that required a subscription, a SIM card, or a connection to IoT devices, leaving 58 apps. We went through these apps one by one to gather data about users’ experiences, and we also semi-automatically analysed the user reviews of these apps that we collected from the app store.
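The screening steps above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors’ actual pipeline; the app names, download counts, and requirement labels below are invented for the example.

```python
# Hypothetical app catalogue: each entry records downloads and any
# usage requirements (subscription, SIM card, IoT device connection).
apps = [
    {"name": "AppA", "downloads": 15_000, "requires": set()},
    {"name": "AppB", "downloads": 500,    "requires": set()},
    {"name": "AppC", "downloads": 80_000, "requires": {"subscription"}},
    {"name": "AppD", "downloads": 12_000, "requires": {"sim_card"}},
    {"name": "AppE", "downloads": 30_000, "requires": set()},
]

# Exclusion criteria mirroring the study's description.
EXCLUDED = {"subscription", "sim_card", "iot_device"}

# Step 1: keep only apps with at least 10,000 downloads.
popular = [a for a in apps if a["downloads"] >= 10_000]

# Step 2: drop apps whose requirements overlap the exclusion set.
included = [a for a in popular if not (a["requires"] & EXCLUDED)]

print([a["name"] for a in included])  # -> ['AppA', 'AppE']
```

In the real study the same two-stage filter reduced 241 candidate apps to 66 popular ones and then to the final 58 analysed in detail.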

Our analysis showed the continued dominance of restriction- and monitoring-based approaches: nearly 94% of the apps log children’s screen time and report it to parents, and over 90% provide functions to block apps or limit children’s screen time. Our analysis of user reviews confirmed previous research findings that such approaches are largely disliked by children and their parents, because they undermine children’s trust in their parents while being easily circumvented by the children (Wisniewski et al. 2017).

More interestingly, however, our analysis found some notable preferences from parents and children regarding how monitoring or restrictions were implemented. Furthermore, the apps we analysed demonstrated some new parenting support features that had rarely been identified in previous research. These new features were particularly welcomed by users because they provide more feedback to parents and enable them to start conversations with their children about various online risks, or because they provide explicit mechanisms to support parents and children in jointly identifying the best way to keep children safe online.

Better perceived restriction and monitoring app features

Nearly half of the apps we analysed employed an ‘all or nothing’ method to restrict what children can and cannot access. For example, parents could either control whether children can access particular websites, watch videos, or perform certain searches, or they could control nothing at all. Although these solutions may provide a straightforward way to keep children safe, parents and children alike disliked their lack of flexibility and found that they caused more conflict than they offered an effective solution for keeping children safe online.

Instead, parents and children preferred apps that provide a “middle ground”: for example, allowing parents to control children’s access based on app/website categories, providing a high-level summary of the activities children performed on their devices (instead of all activities), or allowing parents to create a customised list of contacts or sources (instead of blocking all new contacts).

Another set of features that children and parents both struggled with was the restriction or monitoring of children’s online activities without any explicit notification (found in over 30% of the apps we analysed). Children felt this was both ‘creepy’ and a sign that their parents did not trust them. The emerging alternative features were much more transparent, providing children with information about which of their activities were being monitored (such as browsing history, app use history, and device use time), or providing a way for children to see what restriction or monitoring rules had been set up by their parents.

Some other apps take this even further by providing detailed feedback to children about their online activities (such as how much screen time they have had) as well as explanations of what these activities may mean for them (such as why certain access was blocked). Children, particularly older children, found such features most helpful for developing healthier digital habits and learning about the implications of their online behaviour. Although these new approaches were still a minority among the apps we analysed (around 10%), they were well received by users and offer some important features for future digital parenting apps and tools to consider.

Better perceived app features for digital parenting support

A small proportion of the apps (~15%) in our analysis employed design features that provide dedicated support for digital parenting. Such designs mainly aimed to support or stimulate discussion between parents and children about their online activities, and this was achieved in two ways: first, through features that encourage communication around the restriction and monitoring policies; and second, through coaching using discussion aids.

Both parents and children preferred designs offering ‘high communication support’. These include apps that enable children to negotiate their rules with their parents, provide explicit support encouraging parents and children to set screen rules together, and offer parents advice on strategies ranging from how to apply restriction/monitoring policies to how to talk about sensitive issues. Children, in particular, considered such designs much better: when given the chance to communicate and negotiate with their parents, they felt part of the decision and generally respected the rules more. Parents, meanwhile, also favoured features that support or coach them in conversations with their children, and they felt supported by the app rather than hijacked by it.

Final words

We hope this analysis provides some useful directions for future parental control app development, as well as considerations for parents choosing parental control apps for their own use. There are a number of nuanced factors to consider regarding how these apps may affect children’s online experiences and their learning to keep themselves safe. We do recognise that parents may have to make different decisions for children of different ages, with different needs, or for families facing different practical challenges. However, we hope our study offers some additional dimensions for everyone to think about: what kind of control or monitoring mechanisms would work best for children and their parents, what kind of feedback mechanisms would be appreciated and meaningful, and how better use could be made of the communication and coaching mechanisms out there to support children in exploring exciting online opportunities?

References

Bieke Zaman and Marije Nouwen. 2016. Parental controls: advice for parents, researchers and industry. EU Kids Online.

Wang et al. 2021. Protection or Punishment? Relating the Design Space of Parental Control Apps and Perceptions about Them to Support Parenting for Online Safety. Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue CSCW2 (October 2021).

Wisniewski et al. 2017. Parental Control vs. Teen Self-Regulation: Is There a Middle Ground for Mobile Online Safety? In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing. 51–69.



About the author/s

Dr Jun Zhao is a Senior Researcher in the Department of Computer Science at Oxford University. Her research focuses on investigating the impact of algorithm-based decision making upon our everyday life, especially for families and young children. She takes a human-centric approach, focusing on under…

Ge Wang is a PhD student in the Department of Computer Science at University of Oxford. Her research investigates the algorithmic impact on families and children, and what that means for their long-term development. Her recent research includes children’s perceptions of the risks around online alg…


The Australian Research Council Centre of Excellence for the Digital Child acknowledges the First Australian owners of the lands on which we gather and pays our respects to the Elders, lores, customs and creation spirits of this country.

The Centre recognises that the examples we set in diversity and inclusion will support young children to respect and celebrate differences in all people. We embed diversity, inclusivity and equality into all aspects of the Centre’s activities and welcome all people regardless of race, ethnicity, social background, religion, gender, age, disability, sexual orientation and national origin.