Goose illustration

We Are Protected

UNICEF Canada defines protection as ensuring young people are safe both online and offline—at home, school, work, and in their communities. Protection means freedom from bullying, discrimination, exploitation, or harm, while having access to safe environments and trusted people. It also includes opportunities to build skills for managing risks and receiving support, advocacy, and justice when harm occurs. Our research asks: How safe do young people feel online, and how do digital technologies shape their safety in positive or negative ways?

Watch Digital Portraits

Hear directly from Canadian Gen Z youth in these short videos. Filmed 2024–2025.

Click the playlist icon to choose from 30 videos

Research Snapshot


by: Dr. James Stinson

Mobile devices, social media, online gaming, and AI-mediated platforms have become deeply embedded in youth lives. While these technologies afford opportunities for connection, learning, and self-expression, they also introduce novel risks relating to privacy, harassment, exploitation, and mental health. Understanding these risks — and how technology can be harnessed towards protection — is critical, especially in a Canadian context where youth safety policy, legal regimes, and cultural diversity intersect.

Computer with a smiling face on the screen

This review draws on many types of literature, from empirical prevalence studies and qualitative accounts to intervention reviews, policy analyses, and technical/AI critiques. It frames the discussion under two broad lenses:

(1) The risks posed by digital technologies to youth safety;

(2) The opportunities for these same technologies, and technology-mediated interventions, to bolster protection and resilience.

The final section connects across themes and proposes policy recommendations tailored to the Canadian environment.

Risks of Digital Technology for Youth Safety


Cybervictimization, harassment, and mental health outcomes

A substantial body of research documents how youth in Canada and elsewhere experience cybervictimization, with measurable harms to well-being. Hango (2023) reports prevalence levels for various forms of online harm, including harassment, nonconsensual image sharing, and aggressive contact. Kingsbury and Arim (2023) show that those who report online victimization fare worse on well-being metrics and have poorer mental health outcomes, including depression, anxiety, and self-harm ideation. Focusing on cyber sexual violence, Pashang, Khanlou, and Clarke (2019) show how nonconsensual dissemination of images or exploitative contact can be deeply injurious to identity, self-esteem, and emotional safety, often inducing shame and isolation. Vaillancourt, Faris, and Mishna (2017) argue that the health and clinical implications of cyberbullying warrant active engagement from the psychological and psychiatric disciplines. These harms are not isolated but often co-occur with offline distress and trauma (Vaillancourt et al., 2017).

Canadian surveys and qualitative studies deepen the context. National data on children’s self-reported experiences of cyberbullying indicate that many youth have been targets (Beran, Mishna, McInroy, & Shariff, 2015). Holfeld and Leadbeater (2015) find that cyberbullying behaviors and victimization are already present in early adolescence: risk begins early. Holfeld and Mishna (2018) trace longitudinal relationships: children may shift roles over time (victim, bully, bystander), and early involvement can predict later risk. These dynamics point to persistent patterns and reciprocal relationships between online misconduct and youth social roles.

Content, Contact, Conduct, and Contract Risks

Jang and Ko (2023) adopt the 4Cs framework — content, contact, conduct, and contract — to categorize online risks and align them with policy responses. Content risks involve exposure to harmful or inappropriate material (e.g., violent, sexual, or self-harm-related content). Contact risks involve interactions with strangers, such as grooming, solicitation, or unwanted communications. Conduct risks include cyberbullying, harassment, doxxing, trolling, and reputational aggression. Contract risks cover exploitative contractual terms, deceptive advertising, in-app purchases, privacy breaches, and algorithmic data collection. Many empirical studies map onto these risk domains.

For instance, youth who encounter harmful sexual content or disturbing images face content risks (Jang & Ko, 2023). Youth who contact, or are contacted by, unknown adults (grooming, solicitation) face contact risks. Cyberbullying and harassment fall under conduct, as shown in quantitative and qualitative work (Espelage & Hong, 2017; Beran et al., 2015). Contract risks, meanwhile, become salient in discussions of privacy, data mining, and monetization features. Bowen-Forbes et al. (2024) undertook a scoping review of mobile safety apps aimed at at-risk youth and noted that these apps may themselves collect data or entangle youth in surveillance trade-offs, potentially reintroducing contract-type risks.
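
As a purely illustrative aside, the 4Cs can be treated as a simple tagging scheme for analyzing risk reports. The category labels below follow Jang and Ko's framework, but the code, example reports, and names are hypothetical, a minimal sketch rather than any study's actual coding procedure.

```python
# Illustrative sketch: tagging example online-risk reports with the 4Cs
# categories (Jang & Ko, 2023). The example descriptions are hypothetical.
from enum import Enum

class RiskCategory(Enum):
    CONTENT = "exposure to harmful or inappropriate material"
    CONTACT = "risky interactions with other people, often strangers"
    CONDUCT = "harmful behaviour among peers (bullying, doxxing, trolling)"
    CONTRACT = "exploitative terms, monetization, or data collection"

# Hypothetical reports mapped onto the framework for analysis or triage.
EXAMPLE_REPORTS = [
    ("recommended feed showed violent videos", RiskCategory.CONTENT),
    ("unknown adult repeatedly messaging a 12-year-old", RiskCategory.CONTACT),
    ("group chat used to spread rumours about a classmate", RiskCategory.CONDUCT),
    ("game nudges players toward loot-box purchases", RiskCategory.CONTRACT),
]

for description, category in EXAMPLE_REPORTS:
    print(f"{category.name:8} | {category.value}\n         -> {description}")
```

Grouping reports this way is mainly useful for seeing which of the four channels dominates in a given setting.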

In gaming and virtual worlds, contract risks become even more intricate. Monetization strategies such as loot boxes, microtransactions, and gambling-like features can push youth into exploitative economic relationships (Kou, Hernandez, & Gui, 2025).

Looking at AI moderation in child-centric platforms like Roblox, Chawki (2025) and Choi, Choi, and Seering (2025) find that moderation helps mediate content and conduct risks. AI moderation also raises concerns about transparency, errors (false positives and negatives), contextual sensitivity, and accountability. Balancing moderation, youth agency, and freedom of expression is a delicate task.


Privacy, surveillance, and agency

Many scholars draw attention to youth privacy, surveillance, and the trade-offs that safety measures entail. Shade & Shepherd (2013) propose thinking of youth mobile privacy through a digital policy literacy lens, emphasizing that youth should be capable of understanding not just device features but the policies and systems that govern data collection and surveillance. Shade & Chan (2020) develop a Canadian framing of digital privacy policy literacy, arguing that awareness of platform practices, regulatory regimes, and data flows is crucial for resilient navigation.

Adorjan & Ricciardelli (2019) explore how youth exercise agency in their own privacy practices even in the face of pervasive surveillance. Many youth actively delete posts, manage privacy settings, or control who sees what by compartmentalizing and curating their networks (despite narratives of the “privacy paradox”). In our study, many of the Canadian youth interviewed describe using Facebook for family and community, posting content for that audience that differs from, or is a pared-down version of, what they post to their friend networks on Instagram.

Akter et al. (2022) propose moving from strict parental control to joint family oversight, in which parents and teens negotiate privacy and safety more equitably. Ghosh et al. (2018) explore how youth perceive this relationship: children often feel that parental monitoring apps straddle the line between safety and surveillance, sometimes tipping into invasive surveillance practices.

Some call for technical solutions such as age verification, identity assurance, and content gating. But these systems introduce their own threats: false rejection (blocking someone who should have access), exclusion (leaving out people the system does not work well for, such as those without ID), data breaches (exposure of personal information), and circumvention (users finding ways around the checks).

Forland, Meysenburg, & Solis (2024) and Jarvie & Renaud (2024) analyze these challenges in depth, emphasizing the difficulty of designing age verification systems that protect youth while safeguarding privacy and proportionality. Adib, Zhu, & Ahmad (2023) highlight the interplay of technical, regulatory, and social dynamics in the Canadian context, reminding us that implementing robust age checks is socially and politically complex.
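
To illustrate the data-minimization principle these authors emphasize, the sketch below shows one way an age-assurance flow could share only a signed yes/no claim, so the platform never receives a birthdate or identity document, and route failed checks to recourse rather than silent exclusion. This is a minimal sketch under our own assumptions (the key handling, claim fields, and names such as issue_attestation are illustrative), not a design proposed in the cited work.

```python
# Minimal sketch (not a production design): a data-minimizing age attestation.
# A trusted verifier signs only a boolean claim plus an issue time, so the
# platform never sees a birthdate or ID document. VERIFIER_KEY and
# MAX_AGE_SECONDS are illustrative placeholders.
import base64
import hashlib
import hmac
import json
import time

VERIFIER_KEY = b"demo-shared-secret"   # in practice: asymmetric keys, not a shared secret
MAX_AGE_SECONDS = 24 * 3600            # attestation expires after one day

def issue_attestation(over_13: bool) -> str:
    """Verifier side: sign a minimal claim; no birthdate leaves the verifier."""
    claim = {"over_13": over_13, "issued_at": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + tag

def check_attestation(token: str) -> str:
    """Platform side: verify signature and freshness; route failures to recourse."""
    try:
        payload_b64, tag = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64.encode())
    except ValueError:
        return "recourse"  # malformed token: offer an appeal path, don't just block
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return "recourse"  # possible tampering or an unrecognized verifier
    claim = json.loads(payload)
    if time.time() - claim["issued_at"] > MAX_AGE_SECONDS:
        return "recourse"  # stale attestation: re-verify rather than exclude
    return "allow" if claim["over_13"] else "age_gate"

token = issue_attestation(over_13=True)
print(check_attestation(token))  # -> "allow"
```

Real systems would rely on asymmetric cryptography, accredited verifiers, and audited recourse processes; the point of the sketch is simply that access can be gated without the platform ever collecting identity data.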

Marginalization, Equity, and Structural Vulnerability

The risks of digital technology are not evenly distributed. Sam et al. (2018) provide a qualitative view into Aboriginal youth experiences with cyberbullying and mentoring: they underscore how small communities, visibility, and cultural contexts exacerbate harm and complicate reporting. Ricciardelli & Adorjan (2019) emphasize gendered asymmetries: when girls’ sexual images circulate, social condemnation and reputational harm are often more severe than for boys. Marginalized youth — by identity, socioeconomics, geography, or ability — may lack resources, anonymity, or institutional support.

Media coverage and legal responses have sometimes fueled moral panic rather than nuanced protection. Felt (2015) critiques how overdramatic media portrayals shaped Canadian cyberbullying law. Deschamps and McNutt (2016) caution that legal frameworks may overreach or misdiagnose the problem. Bailey (2016) reviews Canadian cyberbullying law and notes ambiguities, enforcement gaps, and a reactive rather than preventative orientation. Coburn, Connolly, and Roesch (2015) debate whether criminalization is effective or potentially counterproductive.


Opportunities of Technology to Promote Youth Safety


Digital resilience, peer support, and communication

Digital technologies can also act as tools for connection, resilience, and help-seeking. Steeves, McAleese, & Brisson-Boivin (2020) examine how youth and parents talk about online resilience: youth develop strategies (blocking, reporting, curating networks) and often appreciate guidance and open communication. Riazi et al. (2023) present youth voices during COVID-19: many emphasized that “the most important thing is to communicate” — that supportive connection, online or offline, helps buffer distress during isolation. Kaur et al. (2021) describe the Youth First initiative: a youth-led Canadian project during the pandemic that leveraged peer support, resource sharing, and mutual aid.

Especially under conditions where in-person support was constrained, digital platforms allowed youth to maintain social ties, share coping strategies, and access mental health resources. In many cases, these platforms provided the only viable mode for outreach or mutual aid.

Safety Apps, Protective Tools, and Mediated Interventions

Technology itself can be designed to mitigate risk. Bowen-Forbes et al. (2024) review mobile apps designed to support personal safety for at-risk youth. These may include features like panic alerts, geolocation sharing, emergency contacts, or resource directories. While concerns about privacy and data collection are present (as earlier noted), well-designed apps can offer tangible safety benefits, especially for youth in precarious circumstances.

In gaming and virtual spaces, designers can embed safety-by-design features: moderation tools, reporting systems, content filters, age gating, or developer oversight (Choi et al., 2025). For instance, Choi, Choi, and Seering (2025) encourage design changes within Roblox that promote safer collaboration among teen developers by fostering safe social norms and protective scaffolding. Moderation algorithms, especially when supplemented by human oversight, can reduce harmful content or predatory behavior (Chawki, 2025). However, their success hinges on transparency, responsiveness to youth feedback, and contextual sensitivity to developmental and cultural factors.

AI and algorithmic tools also hold promise: detection of grooming language, flagging nonconsensual image sharing, and pattern recognition of harassment. But as noted earlier, these AI tools must be carefully calibrated to avoid overblocking, bias, or chilling legitimate expression (Chawki, 2025). Livingstone et al. (2025) push a child rights approach to designing digital play, arguing that technology platforms should embed rights-sensitive mechanisms — including participatory design, transparency, and contextually appropriate constraints.
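
As a hedged illustration of the "AI plus human oversight" principle described above, automated risk scores can be triaged so that ambiguous cases go to human moderators instead of being silently removed. The scoring function, thresholds, and flag terms below are toy placeholders we have assumed for the sketch, not any platform's actual model.

```python
# Illustrative sketch only: triaging automated risk scores so that uncertain
# cases go to human moderators rather than being auto-removed. The scoring
# function, thresholds, and keyword list are hypothetical placeholders.
from dataclasses import dataclass

FLAG_TERMS = {"send pics", "keep this secret", "don't tell your parents"}  # toy examples

@dataclass
class Decision:
    action: str      # "allow", "human_review", or "block"
    score: float
    reason: str

def risk_score(message: str) -> float:
    """Toy stand-in for a trained classifier: fraction of flag terms present."""
    text = message.lower()
    hits = sum(term in text for term in FLAG_TERMS)
    return hits / len(FLAG_TERMS)

def triage(message: str, allow_below: float = 0.2, block_above: float = 0.8) -> Decision:
    """Uncertain scores land in a human-review band to limit over-blocking."""
    score = risk_score(message)
    if score < allow_below:
        return Decision("allow", score, "low automated risk")
    if score > block_above:
        return Decision("block", score, "high automated risk; log for audit and appeal")
    return Decision("human_review", score, "ambiguous; escalate with context to a moderator")

print(triage("great game, want to trade items?"))
print(triage("keep this secret and send pics, don't tell your parents"))
```

Widening or narrowing the human-review band is one concrete way to trade off over-blocking against missed harms, and it creates a natural point for youth appeals and feedback.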


Education, Intervention, and Digital Citizenship

A number of sources address the potential of well-designed education and intervention to reduce harm. Finkelhor et al. (2021) stress that youth internet-safety education must align with the evidence base: curricula should be developmentally appropriate, evaluated rigorously, and integrated with other social supports. The systematic reviews by Mishna et al. (2009, 2011) assess interventions for youth, parents, and schools: they find modest positive effects, particularly when parental involvement, social skills training, and online safety components are combined. However, methodological limitations abound (short durations, small samples, weak controls).

Espelage & Hong (2017) review broader cyberbullying prevention efforts, highlighting promising strategies such as school policies, peer mediation, restorative practices, digital citizenship curricula, and fostering positive online norms. Hendry et al. (2023) emphasize stakeholder engagement in Western Canada and advocate for context-sensitive, community-informed prevention strategies. Vaillancourt et al. (2017) argue for integration of mental health and cyberharm prevention in clinical settings — screening youth who display distress for online harm exposure. Together, these suggest that digital safety education and intervention can protect youth, particularly when multi-component and culturally attuned.

Policy, Regulation, and Governance Levers

Technology-enabled policy instruments also matter. Jang & Ko (2023) compare digital policies across Canada, the UK, and Australia via the 4Cs framework, indicating how regulation, platform mandates, and statutory protections can shape online risk exposure. Namvarpour et al. (2025) analyze news media coverage of youth online safety, showing how public discourse influences policy prioritization and threat framing.

More specifically, legal frameworks around age verification, platform obligations, and content moderation are emerging as levers. Forland et al. (2024), Jarvie & Renaud (2024), and Adib et al. (2023) explore age verification methods, discussing how responsible, regulated systems can restrict youth exposure to harmful content while preserving privacy and access. Platforms might be held responsible for safety-by-design, transparency mandates, or reporting obligations. Shanmugam & Findlay (2025) argue for co-creation of regulation between youth and policymakers, ensuring rules reflect lived realities rather than top-down templates.

Additionally, media literacy, policy literacy, and the promotion of youth voice can shift the balance of power: youth equipped to contest platform practices and articulate their needs can better hold institutions accountable. This three-pronged approach of education, regulation, and technological design underscores that improving safety is not only about suppressing harm but also about empowering youth to navigate, contest, and co-shape digital spaces.

Summary and Policy Recommendations


This review underscores that digital technologies present a complex mix of risks and potential protective affordances for youth in Canada. On the risk side, youth face cybervictimization, harassment, grooming, exposure to harmful content, privacy violations, reputational harm, and mental health distress. These risks are mediated through content, contact, conduct, and contract channels (Jang & Ko, 2023). Agency, surveillance, and privacy represent critical fault lines: youth may actively try to manage risk, but they are often constrained by opaque platform practices or parental oversight regimes. These challenges are further exacerbated for marginalized youth (e.g., Indigenous youth, girls, racialized youth), who often face structural vulnerabilities and have fewer supports.

On the opportunity side, digital spaces offer vital connection, peer support, help-seeking channels, and continuity of social life — especially during challenging circumstances (e.g., during COVID-19). Safety-oriented apps, moderator systems, AI detection tools, curated virtual environments, and embedding safety in design all offer routes to reduce harm, though not without trade-offs. Education, intervention, and digital citizenship training can build resilience, digital literacy, and social norms of care. Meanwhile, regulatory and policy levers — such as age verification, mandatory safety features on platforms, oversight, and youth co-design — hold potential to shift the systemic environment.

From this synthesis, the following policy recommendations emerge to bolster youth safety in the Canadian context:

  1. Youth-led co-design and policymaking 
    Mandate the inclusion of youth voices (diverse in age, identity, location) in designing safety features, regulation, and educational programs (Shanmugam & Findlay, 2025). Youth input ensures that safety interventions respond to lived experience rather than top-down assumptions.

  2. Comprehensive digital policy and literacy education 
    Incorporate modules at all school levels on digital policy literacy, privacy practices, algorithmic systems, surveillance, and data rights (Shade & Shepherd, 2013; Shade & Chan, 2020; Finkelhor et al., 2021). This helps youth move beyond reactive safety tactics to structural understanding.

  3. Scaffolded oversight via joint family models
    Promote models where parents and teens negotiate shared oversight (not unilateral monitoring) to balance safety and autonomy (Akter et al., 2022). Encourage family communication, joint rule-making, and open dialogue.

  4. Rigorous evaluation and longitudinal intervention investment
    Fund large-scale, longitudinal, controlled studies of digital safety interventions (education curricula, apps, moderation tools) to identify what works under Canadian conditions (Mishna et al., 2009, 2011; Espelage & Hong, 2017). Prioritize replication, sustained follow-up, and cross-provincial comparisons.

  5. Platform accountability and safety-by-design mandates
    Enact regulatory standards requiring platforms to embed child-sensitive safety features (reporting, moderation, age gating, transparency) and to report metrics (e.g. removal rates, appeals) to oversight bodies. Moderation systems should combine AI and human review and allow youth appeals (Chawki, 2025; Livingstone et al., 2025; Jarvie & Renaud, 2024).

  6. Privacy-respecting age assurance systems
    Pilot and regulate age verification systems that minimize data exposure, adopt cryptographic proofs or proxy verification, and allow recourse (Forland et al., 2024; Jarvie & Renaud, 2024; Adib et al., 2023). Government-platform partnerships may be necessary to balance safety with rights.

  7. Equity-informed protections and outreach
    Tailor safety initiatives to priority populations (Indigenous youth, rural/remote youth, girls, racialized youth) with culturally adapted programs, linguistic access, and trusted local supports (Sam et al., 2018; Ricciardelli & Adorjan, 2019; Hendry et al., 2023). Ensure resources reach underserved communities.

  8. Integrated response systems and cross-sector collaboration
    Develop coordinated protocols among education, mental health, child welfare, law enforcement, and youth services for responding to serious online harm, with trauma-informed pathways and referral networks (Vaillancourt et al., 2017; Hendry et al., 2023).

  9. Continuous monitoring and adaptive regulation
    Establish a permanent digital safety observatory in Canada to monitor emerging risks (e.g. metaverse, AI bots, algorithmic harassment), collect youth feedback, analyze trends, and dynamically update policy (Namvarpour et al., 2025; Reed & Joseff, 2023). Policy must stay agile as technologies evolve.

  10. Promote safe digital communities and peer support networks
    Support youth-led platforms, peer moderation, trusted reporting networks, and resilience communities (e.g. Youth First, Kaur et al., 2021). Encouraging positive norms and collective safety can amplify the protective potential of digital spaces.

By combining robust education, participatory design, strong regulation, careful technical implementation, and equity sensitivity, Canada can better harness the promise of mobile, social, gaming, and AI platforms — while minimizing their perils — to create safer digital ecosystems in which youth can flourish rather than merely survive.


How to Cite this Text: Stinson, J. (2025). We Are Protected. In Digital Wellbeing Hub. Young Lives Research Lab. https://www.digitalwellbeinghub.ca/we-are-protected — Funded by the Government of Canada.

Resources


Explore Resources By:

Research – Reports, articles, findings, background reading

For Youth – Interactive tools, guides, and activities designed for youth

Supporting Youth – Resources for educators, parents, policymakers, and practitioners


Digital and Privacy Rights: Resources for Children, Youth, Educators, & Parents

Information and Privacy Commissioner of Ontario

Resources for advancing digital rights and preparing young people to be safe, responsible digital citizens.

Explore

Impact of Regulation on Children’s Digital Lives

Digital Futures for Children

How UK/EU privacy laws shifted platforms toward safer defaults for kids—private accounts, limits on targeting—and what still needs work.

Read the Report

On Compliance

5Rights Foundation

Articles, briefs, and press on making digital services meet children’s rights and regulatory standards.

Learn More

Avoiding Fraud: Common Frauds and Scams

Canadian Securities Administrators

Learn about common frauds and scams and how to spot them.

Learn More

Child Online Safety Toolkit

5Rights Foundation

Practical guidance for policymakers and advocates to build age-appropriate, safe-by-design digital spaces.

Get the Toolkit

Building Resilience to Online Misinformation

MediaSmarts (Canada)

Canadian research on how misinformation spreads—and strategies that help youth recognize and resist it.

Read the Report

Fact Checking!

MediaSmarts

Hands-on tools and videos—like “Break the Fake” and the Teen Fact-Checking Network—to verify claims and stop the spread.

Explore Resources

Online Safety Resources

UNICEF Australia

Guides for parents and teachers to help kids navigate risks, report harm, and stay safe online.

View Guides

Manipulation Nation

Digital Public Square

An educational tool that provides people with skills to identify manipulative content and tactics online.

Play Now

Podcast: Who’s Watching? Surveillance Capitalism

5Rights Foundation – Podcast

How data-driven business models affect children’s rights and democracy, in plain language.

Listen Now

Convention on the Rights of the Child

United Nations

The global treaty protecting children from harm and upholding their rights—offline and online.

Read the Convention

Children’s Rights in the Digital Age – In Our Own Words

5Rights Foundation

A youth version of General Comment No. 25—clear explanations of digital rights and how to use them.

Read Now

MediaSmarts

Canada’s Centre for Digital Media Literacy

Programs and classroom-ready lessons to build critical thinking, privacy awareness, and safer online habits.

Visit Site

Internet Matters

Guides for Parents & Educators

Step-by-step safety guides, device controls, and advice for tackling bullying, grooming, and other online risks.

View Guides

Centre for Humane Technology

Designing for Wellbeing & Safety

Research and tools to reduce harms from attention-driven platforms and support healthier tech ecosystems.

Explore

Centre for Digital Rights

Advocacy & Policy

Legal and policy initiatives to protect people—especially children—from exploitative digital practices.

Visit Site

Media Ecosystem Observatory

MEO (Canada)

Independent research tracking misinformation and harmful narratives—useful for educators and policy makers.

Learn More

Digital Futures for Children

Rights-Respecting Digital World

A research hub that amplifies children’s voices and builds the evidence base for safer, rights-based digital design.

Visit Site

Need Help Now: Removing Sexual Photos

Canadian Centre for Child Protection Inc.

Help with removing sexual photos, plus information about sextortion. If you are worried that a nude of you taken while under the age of 18 is being shared online, there is help available: there are steps you can take to regain control, and people to support you. You don't have to deal with this by yourself.

Get Help Now

Click if you Agree

MediaSmarts

An educational game designed for young people aged 12 to 14, and a helpful tool for all ages. Develop the skills and confidence to read privacy policies and terms of use instead of blindly clicking “I agree.”

Play Now

How Cyber-Savvy Are You?

MediaSmarts

Test your cyber-security awareness with MediaSmarts’ quiz and explore tip sheets.

Go to Quiz

Advice: Supporting Neurodivergent Children and Young People

Internet Matters

Tailored guidance on safe, supportive online connection for neurodivergent children and young people. Explore the guides to help keep them safe. Notes: 1) Internet Matters is partially funded by, and in partnership with, big-tech and big-media companies, so be mindful of potential bias. 2) As our research snapshot notes, efforts to ‘protect’ kids online can sometimes become invasive surveillance that harms child and youth agency; co-creating rules and protection measures with young people is advised.

Explore Guides Now

Infographic: Young people and exposure to harmful online content

Statistics Canada

See the infographic and statistics on young people’s exposure to harmful online content in Canada in 2022 (data released in 2024).

Read Now

Canadian Shield: Free Public DNS Resolver

Canadian Internet Registration Authority (CIRA)

This free, easy-to-use security tool is available as a mobile app or browser extension, giving you protection across all your devices. Built by and for Canadians.

Learn More

Tip Sheet - Using Parental Controls

MediaSmarts

A guide to parental controls on Android and Apple devices, covering many platforms including search engines, social media (TikTok, Snapchat, Instagram), and games (Roblox, Minecraft, etc.).

Read Now

Podcast: Who’s Watching? How Tech Companies Profit from Children’s Data

5Rights Foundation

5Rights Youth Ambassador Srishti speaks with Prof. Shoshana Zuboff, author of The Age of Surveillance Capitalism, to uncover how tech companies like Google and Meta harvest young people’s data, manipulate our behaviour, and turn childhood into a marketplace.

Listen Now

Blacklight: A Real-Time Website Privacy Investigator

The Markup

Who is peeking over your shoulder while you work, watch videos, learn, explore, and shop on the internet? Enter the address of any website, and Blacklight will scan it and reveal the specific user-tracking technologies on the site—and who’s getting your data.

Explore Now

The Markup: Newsroom

The Markup

The Markup is a non-profit newsroom that challenges technology to serve the public good. They use a scientific approach that includes building datasets from scratch and showing their work. Explore investigations, tools, and blueprints.

Visit Newsroom

Get Smart About Money

Ontario Securities Commission

Improve your financial knowledge with free, unbiased resources. Learn about investing and spotting AI scams, access cash flow and savings calculators, and more.

Learn More

Explore More Dimensions of Wellbeing

Complete the Survey!

Join the research conversation and see what others are saying (no email required).

View Live Results
Illustration of young people around the globe
illustrations by: Zakirah Allain
Looking up at a tall, old tree in a forest with vibrant yellow and orange autumn leaves.

photo by: Hayden Pinchin

Works Cited


Abarbanel, B., Gainsbury, S. M., King, D., Hing, N., & Delfabbro, P. H. (2017). Gambling games on social platforms: How do advertisements for social casino games target young adults? Policy & Internet, 9(2), 184–209.

Adib, A., Zhu, W. P., & Ahmad, M. O. (2023, November). Improvising age verification technologies in Canada: Technical, regulatory and social dynamics. In 2023 IEEE International Humanitarian Technology Conference (IHTC) (pp. 1–6). IEEE. https://doi.org/10.1109/IHTC58010.2023.10409376

Adorjan, M., & Ricciardelli, R. (2018). Cyber-risk and youth: Digital citizenship, privacy and surveillance. Routledge.

Adorjan, M., & Ricciardelli, R. (2019). A new privacy paradox? Youth agentic practices of privacy management despite “nothing to hide” online. Canadian Review of Sociology / Revue canadienne de sociologie, 56(1), 8–29.

Adorjan, M., Ricciardelli, R., & Saleh, T. (2022). Parental technology governance: Teenagers’ understandings and responses to parental digital mediation. Qualitative Sociology Review, 18(2), 112–130.

Adorjan, M., & Ricciardelli, R. (2019). Student perspectives towards school responses to cyber-risk and safety: The presumption of the prudent digital citizen. Learning, Media and Technology, 44(4), 430–442. https://doi.org/10.1080/17439884.2019.1640741

Agosto, D. E., & Abbas, J. (2017). “Don’t be dumb—that’s the rule I try to live by”: A closer look at older teens’ online privacy and safety attitudes. New Media & Society, 19(3), 347–365. https://doi.org/10.1177/1461444815606121

Akter, M., Godfrey, A. J., Kropczynski, J., Lipford, H. R., & Wisniewski, P. J. (2022). From parental control to joint family oversight: Can parents and teens manage mobile online safety and privacy as equals? Proceedings of the ACM on Human-Computer Interaction, 6(CSCW1), Article 1. https://doi.org/10.1145/3512956

Álvarez-García, D., García, T., & Suárez-García, Z. (2018). The relationship between parental control and high-risk internet behaviours in adolescence. Social Sciences, 7(6), Article 87. https://doi.org/10.3390/socsci7060087

Andrews, J. C., Walker, K. L., Netemeyer, R. G., & Kees, J. (2023). Helping youth navigate privacy protection: Developing and testing the children’s online privacy scale. Journal of Public Policy & Marketing, 42(3), 223–241.

Bailey, J. (2016). Canadian legal approaches to ‘cyberbullying’ and cyberviolence: An overview (Ottawa Faculty of Law Working Paper No. 2016-37). https://doi.org/10.2139/ssrn.2797272

Baldry, A. C., Sorrentino, A., & Farrington, D. P. (2019). Cyberbullying and cybervictimization versus parental supervision, monitoring and control of adolescents’ online activities. Children and Youth Services Review, 96, 302–307. https://doi.org/10.1016/j.childyouth.2018.11.058

Barnick, H., Campbell, V. M., & Tilleczek, K. C. (2019). The way we live now: Privacy, surveillance, and control of youth in the digital age. In Youth in the Digital Age (pp. 60–79). Routledge.

Beran, T., Mishna, F., McInroy, L. B., & Shariff, S. (2015). Children’s experiences of cyberbullying: A Canadian national study. Children & Schools, 37(4), 207–214. https://doi.org/10.1093/cs/cdv017

Bowen-Forbes, C., Khondaker, T., Stafinski, T., Hadizadeh, M., & Menon, D. (2024). Mobile apps for the personal safety of at-risk children and youth: Scoping review. JMIR mHealth and uHealth, 12(1), e58127. https://doi.org/10.2196/58127

Broll, R., & Reynolds, D. (2021). Parental responsibility, blameworthiness, and bullying: Parenting style and adolescents’ experiences with traditional bullying and cyberbullying. Criminal Justice Policy Review, 32(5), 447–468. https://doi.org/10.1177/0887403419851865

Cassidy, W., Faucher, C., & Jackson, M. (2018). “You need a thick skin…”: Impacts of cyberbullying at Canadian universities. In Cyberbullying at University in International Contexts (pp. 112–125). Routledge.

Chawki, M. (2025). AI moderation and legal frameworks in child-centric social media: A case study of Roblox. Laws, 14(3), Article 29. https://doi.org/10.3390/laws14030029

Choi, Y., Choi, J., & Seering, J. (2025, April). Leveling up together: Fostering positive growth and safe online spaces for teen Roblox developers. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1–18). ACM. (DOI pending / not listed in original)

Coburn, P. I., Connolly, D. A., & Roesch, R. (2015). Cyberbullying: Is federal criminal legislation the solution? Canadian Journal of Criminology and Criminal Justice, 57(4), 566–579. https://doi.org/10.3138/cjccj.2014.E31

Davis, I. S., Thornburg, M. A., Patel, H., & Pelham, W. E., III. (2024). Digital location tracking of children and adolescents: A theoretical framework and review. Clinical Child and Family Psychology Review, 27(4), 943–965. https://doi.org/10.1007/s10567-023-00493-8

Deschamps, R., & McNutt, K. (2016). Cyberbullying: What’s the problem? Canadian Public Administration, 59(1), 45–71. https://doi.org/10.1111/capa.12165

De Simone, E., Rollo, S., & Rochira, A. (2025). Digital ambivalence in the online era: A social-ecological study of parental monitoring. Journal of Child and Family Studies, 34, 2290–2304. https://doi.org/10.1007/s10826-025-03138-4

Erickson, L. B., Wisniewski, P., Xu, H., Carroll, J. M., Rosson, M. B., & Perkins, D. F. (2016). The boundaries between: Parental involvement in a teen’s online world. Journal of the Association for Information Science and Technology, 67(6), 1384–1403. https://doi.org/10.1002/asi.23450

Espelage, D. L., & Hong, J. S. (2017). Cyberbullying prevention and intervention efforts: Current knowledge and future directions. The Canadian Journal of Psychiatry, 62(6), 374–380. https://doi.org/10.1177/0706743716684793

Felt, M. (2015). The incessant image: How dominant news coverage shaped Canadian cyberbullying law. University of New Brunswick Law Journal, 66, 137–166.

Finkelhor, D., Walsh, K., Jones, L., Mitchell, K., & Collier, A. (2021). Youth internet safety education: Aligning programs with the evidence base. Trauma, Violence, & Abuse, 22(5), 1233–1247. https://doi.org/10.1177/1524838019843191

Forland, S., Meysenburg, N., & Solis, E. (2024). Age verification: The complicated effort to protect youth online. New America. https://www.newamerica.org

Freed, D., Bazarova, N. N., Consolvo, S., Han, E. J., Kelley, P. G., Thomas, K., & Cosley, D. (2023, April). Understanding digital-safety experiences of youth in the US. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1–15). ACM. https://doi.org/10.1145/3544548.3580965

Ghosh, A. K., Badillo-Urquiola, K., Guha, S., LaViola, J. J., & Wisniewski, P. J. (2018, April). Safety vs. surveillance: What children have to say about mobile apps for parental control. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1–14). ACM. https://doi.org/10.1145/3173574.3173696

Ghosh, A. K., Badillo-Urquiola, K., Rosson, M. B., Xu, H., Carroll, J. M., & Wisniewski, P. J. (2018, April). A matter of control or safety? Examining parental use of technical monitoring apps on teens’ mobile devices. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1–14). ACM. https://doi.org/10.1145/3173574.3173697

Greyson, D., Chabot, C., Mniszak, C., & Shoveller, J. A. (2023). Social media and online safety practices of young parents. Journal of Information Science, 49(5), 1344–1357. https://doi.org/10.1177/01655515221104468

Hango, D. (2023). Online harms faced by youth and young adults: The prevalence and nature of cybervictimization. Statistics Canada. https://www150.statcan.gc.ca/n1/pub/85-002-x/2023001/article/00001-eng.htm

Hango, D. W. (2016). Cyberbullying and cyberstalking among Internet users aged 15 to 29 in Canada. Statistics Canada. https://www.researchgate.net/publication/311754653_Cyberbullying_and_cyberstalking_among_Internet_users_aged_15_to_29_in_Canada

Hasinoff, A. A. (2017). Where are you? Location tracking and the promise of child safety. Television & New Media, 18(6), 496–512. https://doi.org/10.1177/1527476416647494

Hendry, B. P., Hellsten, L. A. M., McIntyre, L. J., & Smith, B. R. (2023). Recommendations for cyberbullying prevention and intervention: A Western Canadian perspective from key stakeholders. Frontiers in Psychology, 14, Article 1067484. https://doi.org/10.3389/fpsyg.2023.1067484

Holfeld, B., & Leadbeater, B. J. (2015). The nature and frequency of cyberbullying behaviors and victimization experiences in young Canadian children. Canadian Journal of School Psychology, 30(2), 116–135. https://doi.org/10.1177/0829573514542216

Holfeld, B., & Mishna, F. (2018). Longitudinal associations in youth involvement as victimized, bullying, or witnessing cyberbullying. Cyberpsychology, Behavior, and Social Networking, 21(4), 234–239. https://doi.org/10.1089/cyber.2017.0369

Jarvie, C., & Renaud, K. (2024). Online age verification: Government legislation, supplier responsibilization, and public perceptions. Children, 11(9), Article 1068. https://doi.org/10.3390/children11091068

Jang, Y., & Ko, B. (2023). Online safety for children and youth under the 4Cs framework—A focus on digital policies in Australia, Canada, and the UK. Children, 10(8), Article 1415. https://doi.org/10.3390/children10081415

Karadimce, A., & Bukalevska, M. (2023). Threats targeting children on online social networks. WSEAS Transactions on Advances in Engineering Education, 20(4), 25–31.

Kaur, H., Howe, C., Whitty, J., Quigley, K., Katerenchuk, R., Bonnell, V., … Pearson, G. (2021). Youth First: A Canadian youth-led initiative in the midst of the COVID-19 pandemic. Canadian Journal of Children’s Rights / Revue canadienne des droits des enfants, 8(1), 136–152.

Katapally, T. R., Thorisdottir, A. S., Laxer, R., & Leatherdale, S. T. (2018). The association of school connectedness and bullying involvement with multiple screen-time behaviours among youth in two Canadian provinces: A COMPASS study. Chronic Diseases and Injuries in Canada (Promotion de la santé et prévention des maladies chroniques au Canada), 38(10).

Khanta, K. (2024). Protecting the Innocent: Safeguarding Children in Immersive Online Gaming Environments. Marymount University.

Khurana, A., Bleakley, A., Jordan, A. B., & Romer, D. (2015). The protective effects of parental monitoring and internet restriction on adolescents’ risk of online harassment. Journal of Youth and Adolescence, 44(5), 1039–1047.

Kou, Y., Hernandez, R. H., & Gui, X. (2025, April). “The system is made to inherently push child gambling in my opinion”: Child safety, monetization, and moderation on Roblox. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1–18). 

Law, D. M., Shapka, J. D., & Olson, B. F. (2010). To control or not to control? Parenting behaviours and adolescent online aggression. Computers in Human Behavior, 26(6), 1651–1656. https://doi.org/10.1016/j.chb.2010.05.024

Li, Q. (2008). Cyberbullying in schools: An examination of preservice teachers’ perception. Canadian Journal of Learning and Technology, 34(2), n2.

Liau, A. K., Khoo, A., & Ang, P. H. (2008). Parental awareness and monitoring of adolescent Internet use. Current Psychology, 27(4), 217–233. https://doi.org/10.1007/s12144-008-9036-6

Lionetti, F., Palladino, B. E., Moses Passini, C., Casonato, M., Hamzallari, O., Ranta, M., … Keijsers, L. (2019). The development of parental monitoring during adolescence: A meta-analysis. European Journal of Developmental Psychology, 16(5), 552–580.

Livingstone, S. (2013). Online risk, harm and vulnerability: Reflections on the evidence base for child Internet safety policy. ZER: Journal of Communication Studies, 18(35), 13–28.

Livingstone, S., Nair, A., Stoilova, M., van der Hof, S., & Caglar, C. (2024). Children’s rights and online age assurance systems: The way forward. The International Journal of Children’s Rights, 32(3), 721–747.

Livingstone, S., Ólafsson, K., & Pothong, K. (2025). Digital play on children’s terms: A child-rights approach to designing digital experiences. New Media & Society, 27(3), 1465–1485.

Maqsood, S., & Chiasson, S. (2021, May). “They think it’s totally fine to talk to somebody on the internet they don’t know”: Teachers’ perceptions and mitigation strategies of tweens’ online risks. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–17). ACM.

Martins, M. V., Formiga, A., Santos, C., Sousa, D., Resende, C., Campos, R., … Ferreira, S. (2020). Adolescent internet addiction – role of parental control and adolescent behaviours. International Journal of Pediatrics and Adolescent Medicine, 7(3), 116–120.

McInroy, L. B., & Mishna, F. (2017). Cyberbullying on online gaming platforms for children and youth. Child and Adolescent Social Work Journal, 34(6), 597–607. https://doi.org/10.1007/s10560-017-0494-8

Mhavan, N. V., & Singh, P. R. (2026). A safe digital future: An integrated approach to parental controls in social media and gaming platforms. In Integrating Parental Consent and Child Engagement With Digital Protection Rules (pp. 137–162). IGI Global Scientific Publishing.

Mitchell, K. J., Finkelhor, D., & Wolak, J. (2005). Protecting youth online: Family use of filtering and blocking software. Child Abuse & Neglect, 29(7), 753–765. https://doi.org/10.1016/j.chiabu.2004.12.006

Montgomery, K. C., Chester, J., & Milosevic, T. (2017). Ensuring young people’s digital privacy as a fundamental right. In International Handbook of Media Literacy Education (pp. 85–102). Routledge.

Namvarpour, M., Aghakhani, E., Ekstrand, M. D., Rezapour, R., & Razi, A. (2025, May). The evolving landscape of youth online safety: Insights from news media analysis. In Proceedings of the 17th ACM Web Science Conference (pp. 261–271). 

Padilla-Walker, L. M., Coyne, S. M., Kroff, S. L., & Memmott-Elison, M. K. (2018). The protective role of parental media monitoring style from early to late adolescence. Journal of Youth and Adolescence, 47(2), 445–457. https://doi.org/10.1007/s10964-017-0740-4

Paat, Y. F., & Markham, C. (2021). Digital crime, trauma, and abuse: Internet safety and cyber risks for adolescents and emerging adults in the 21st century. Social Work in Mental Health, 19(1), 18–40. https://doi.org/10.1080/15332985.2021.1881974

Pashang, S., Khanlou, N., & Clarke, J. (2019). The mental health impact of cyber sexual violence on youth identity. International Journal of Mental Health and Addiction, 17(5), 1119–1131. https://doi.org/10.1007/s11469-018-9956-3

Reed, N., & Joseff, K. (2023). Kids and the Metaverse. Common Sense Media. https://www.commonsensemedia.org/sites/default/files/featured-content/files/metaverse-white-paper-1.pdf

Ricciardelli, R., & Adorjan, M. (2019). “If a girl’s photo gets sent around, that’s a way bigger deal than if a guy’s photo gets sent around”: Gender, sexting, and the teenage years. Journal of Gender Studies, 28(5), 563–577. https://doi.org/10.1080/09589236.2018.1563889

Richmond, J. O. (2026). The importance of parental monitoring and its impact on children’s online safety. In Integrating Parental Consent and Child Engagement With Digital Protection Rules (pp. 251–280). IGI Global Scientific Publishing.

Riazi, N. A., Goddard, J., Lappin, S., Michaelson, V., Wade, T. J., & Patte, K. A. (2023). “The most important thing is to communicate with students”: Experiences and voices of Canadian youth during the COVID-19 pandemic. International Journal of Adolescence and Youth, 28(1), Article 2239327. https://doi.org/10.1080/02673843.2023.2239327

Robinson, J., Thorn, P., McKay, S., Richards, H., Battersby-Coulter, R., Lamblin, M., Hemming, L., & La Sala, L. (2023). The steps that young people and suicide prevention professionals think the social media industry and policymakers should take to improve online safety: A nested cross-sectional study within a Delphi consensus approach. Frontiers in Child & Adolescent Psychiatry, 2, Article 1274263.

Robinson, J., Bailey, E., Hetrick, S., Paix, S., O’Donnell, M., Cox, G., Ftanou, M., & Skehan, J. (2017). Developing social media–based suicide prevention messages in partnership with young people: Exploratory study. JMIR Mental Health, 4(4), e78. https://doi.org/10.2196/mental.8383

Ronis, S., & Slaunwhite, A. (2019). Gender and geographic predictors of cyberbullying victimization, perpetration, and coping modalities among youth. Canadian Journal of School Psychology, 34(1), 3–21. https://doi.org/10.1177/0829573518759856

Sam, J., Wisener, K., Schuitemaker, N., & Jarvis-Selinger, S. A. (2018). Aboriginal youth experiences with cyberbullying: A qualitative analysis of Aboriginal e-mentoring BC. International Journal of Indigenous Health, 13(1), 5–19.

Sarah, S., Lee, D., White, E., Rezaei, S. H., & Tremblay, A. (Year not specified). Opportunities and threats of the metaverse for electronic commerce in Canada. Journal of Electronic Commerce Management, 1, 41–64. (DOI/URL not provided in original list)

Shade, L. R., & Chan, S. (2020). Digital privacy policy literacy: A framework for Canadian youth. In The Handbook of Media Education Research (pp. 327–338). 

Shade, L. R., & Shepherd, T. (2013). Viewing youth and mobile privacy through a digital policy literacy framework. First Monday, 18(12). https://doi.org/10.5210/fm.v18i12.5043

Simpson, B. (2014). Tracking children, constructing fear: GPS and the manufacture of family safety. Information & Communications Technology Law, 23(3), 273–285. https://doi.org/10.1080/13600834.2014.921495

Shapka, J. D., Onditi, H. Z., Collie, R. J., & Lapidot-Lefler, N. (2018). Cyberbullying and cybervictimization within a cross-cultural context: A study of Canadian and Tanzanian adolescents. Child Development, 89(1), 89–99. https://doi.org/10.1111/cdev.12817

Shanmugam, S., & Findlay, M. (2025). Empowering young digital citizens: The call for co-creation between youth and policymakers in regulating for online safety and privacy. In Mobile Media Use Among Children and Youth in Asia (pp. 159–184). Springer Netherlands. 

Smetana, J. G. (2008). “It’s 10 o’clock: Do you know where your children are?” Recent advances in understanding parental monitoring and adolescents’ information management. Child Development Perspectives, 2(1), 19–25.

Steeves, V., McAleese, S., & Brisson-Boivin, K. (2020). Young Canadians in a wireless world, phase IV: Talking to youth and parents about online resiliency. MediaSmarts. https://mediasmarts.ca

Symons, K., Ponnet, K., Emmery, K., Walrave, M., & Heirman, W. (2017). Parental knowledge of adolescents’ online content and contact risks. Journal of Youth and Adolescence, 46(2), 401–416. https://doi.org/10.1007/s10964-016-0479-9

Vaillancourt, T., Faris, R., & Mishna, F. (2017). Cyberbullying in children and youth: Implications for health and clinical practice. The Canadian Journal of Psychiatry, 62(6), 368–373. https://doi.org/10.1177/0706743716684867

Vaala, S. E., & Bleakley, A. (2015). Monitoring, mediating, and modeling: Parental influence on adolescent computer and Internet use in the United States. Journal of Children and Media, 9(1), 40–57. https://doi.org/10.1080/17482798.2014.933280

Wisniewski, P., Ghosh, A. K., Xu, H., Rosson, M. B., & Carroll, J. M. (2017, February). Parental control vs. teen self-regulation: Is there a middle ground for mobile online safety? In Proceedings of the 2017 ACM CSCW (pp. 51–69). ACM. https://doi.org/10.1145/2998181.2998245

Xiao, B., Parent, N., Bond, T., Sam, J., & Shapka, J. (2024). Developmental trajectories of cyber-aggression among early adolescents in Canada: The impact of aggression, gender, and time spent online. International Journal of Environmental Research and Public Health, 21(4). https://doi.org/10.3390/ijerph21040429

Explore More Wellbeing Dimensions