Youth Digital Wellness Initiative

The internet wasn't designed with children in mind.

Safe Scroll is a policy advocacy and digital wellness initiative working to create safer, more accountable online environments for young people — through evidence-informed advocacy, systemic reform, and community education.

The Evidence

Why this matters

Excessive and harmful screen time isn't just a parenting concern — it's a systemic issue with measurable consequences for young people's health, development, and safety.

Attention & Focus

Constant notifications, autoplay, and algorithmically optimized feeds train the brain for rapid stimulus switching — making sustained attention increasingly difficult for young people still in critical developmental periods.

Sleep Disruption

Late-night screen use suppresses melatonin production and delays sleep onset. Poor sleep in adolescents is linked to lower academic performance, mental health challenges, and long-term health consequences.

Emotional Regulation

Social comparison, cyberbullying, and exposure to emotionally charged content affect self-esteem and mood regulation. Adolescents are especially vulnerable during the identity formation years.

Information Vulnerability

Young people are disproportionately exposed to misinformation, extremist content, and manipulative advertising through algorithmic amplification — without the media literacy to critically evaluate what they see.

Addictive Design

Variable reward loops, infinite scroll, streak mechanics, and engagement-maximizing algorithms are borrowed from behavioral psychology and gambling research — and applied to products used by children with few guardrails.

Weak Safeguards

Despite acknowledging the risks, many platforms apply minimal safeguards for minors. Age gates are easily bypassed, default settings favor engagement over safety, and enforcement of child-protection policies is inconsistent.

Structural Analysis

The problem landscape

Harmful digital experiences for young people don't arise from a single cause. They stem from interconnected structural failures across technology design, regulation, and education.

Root Causes

Engagement-first design

Platform revenue models reward time-on-site above all else, creating strong financial incentives against user wellbeing.

Regulatory vacuum

Child-specific digital protections are sparse, outdated, or inadequately enforced in most jurisdictions.

Digital literacy deficit

Young people, parents, and educators often lack the context to understand how platforms influence behavior and thinking.

Opaque data collection

Platforms collect behavioral data from minors at scale, using it to refine engagement systems with little transparency or oversight.

Immediate Harms

Sleep disruption

Late-night device use and stimulating content interfere with sleep cycles critical to adolescent brain development.

Attention fragmentation

Constant interruptions and rapid-fire content reduce the capacity for deep focus and extended learning.

Social comparison & anxiety

Curated, idealized social media content fosters unhealthy comparison and contributes to anxiety and low self-worth.

Exposure to harmful content

Algorithmic amplification can expose young users to extreme, violent, or age-inappropriate content without adequate filtering.

Long-Term Risks

Chronic mental health challenges

Prolonged exposure to harmful digital environments is associated with elevated rates of anxiety, depression, and social withdrawal.

Erosion of civic trust

A generation raised on algorithmically curated misinformation may enter adulthood with fractured relationships to facts and institutions.

Developmental interference

Screen time displacing physical activity, in-person relationships, and unstructured play may impair key developmental milestones.

Normalized surveillance

Early exposure to pervasive data collection normalizes privacy erosion, with long-term implications for individual autonomy.

Regulatory Context

Policy and legal gaps

Existing frameworks were written for a different internet. The platforms children use today have outpaced the protections that govern them.

What exists today

The Children's Online Privacy Protection Act (COPPA) in the United States and the UK Age Appropriate Design Code represent meaningful steps toward protecting children online. The EU's General Data Protection Regulation (GDPR) includes some child-specific provisions, and several countries have introduced age-appropriate design requirements.

These frameworks establish that children deserve distinct protections online — but they fall short in scope, enforcement, and technological specificity.

COPPA (US) · GDPR (EU) · UK Age Appropriate Design Code · State-level legislation

Where the gaps are

Enforcement

Even well-designed laws fail when agencies lack resources or jurisdiction to hold large platforms accountable. COPPA violations are common, yet enforcement actions are rare relative to scale.

Algorithmic transparency

No major jurisdiction currently requires platforms to disclose how their recommendation systems operate when serving content to minors — leaving regulators, parents, and researchers in the dark.

Age verification

Platforms largely rely on self-reported birthdates for age gating — a system trivially bypassed by any child. Privacy-preserving age verification solutions exist, but have not been broadly mandated.

Platform design accountability

Laws focus primarily on data collection, not product design. Addictive features — infinite scroll, autoplay, variable reward notifications — remain largely unregulated despite documented harms.

A Path Forward

Safe Scroll solutions

Protecting young people in digital spaces requires action across multiple fronts — from platform design to policy reform to community education.

Age-Appropriate Design

Platforms serving young users should be required to implement privacy-preserving age verification and to default to the most protective settings for verified minor accounts — with features that match developmental stage.

Transparent Algorithms

Content recommendation systems that serve minors should be subject to independent audit, public disclosure requirements, and meaningful oversight — so that the systems shaping what young people see are not operating in a black box.

Parental Tools & Better Defaults

Families deserve meaningful, usable tools — not buried settings menus. Platforms should default to safer options for minors and provide parents with clear, accessible dashboards to understand and manage their child's digital experience.

Digital Literacy Education

Young people should understand how the platforms they use work — including how their data is collected, why certain content is recommended to them, and which behavioral design techniques are used to influence them.

School & Community Involvement

Educators, school counselors, and community leaders are on the front lines of this issue. Safe Scroll supports school-based programs that equip students and teachers with practical frameworks for healthier digital habits.

Platform Accountability

Technology companies must face meaningful consequences for product designs that foreseeably harm children. Legislation should move beyond data privacy to address features explicitly designed to maximize engagement at the expense of user wellbeing.

Our Audience

Who this is for

Safer digital environments for young people are a shared responsibility. Safe Scroll speaks to everyone with a stake in how the next generation experiences technology.

Parents

You don't need to be a technologist to advocate for your child's digital safety. Safe Scroll translates complex platform behavior into clear, actionable understanding — and supports systemic changes that no individual parent can achieve alone.

Families & Caregivers

Educators

Teachers and school counselors see the effects of excessive screen time in classrooms every day. Safe Scroll supports educators with frameworks for building digital literacy into school culture and advocating for policy change at the district level.

Schools & Counselors

Students

Young people aren't the problem — they're navigating environments not designed with their interests in mind. Safe Scroll is for students who want to understand the systems shaping their digital lives and engage in meaningful advocacy for change.

Youth & Young Adults

Policymakers

Legislators, regulators, and public officials have the authority to close the gaps in digital child protection. Safe Scroll provides a clear, evidence-grounded case for targeted reforms that go beyond data privacy to address platform design accountability.

Government & Regulators

Technology Companies

Safe Scroll is not anti-technology — it is pro-accountability. Technology companies that genuinely invest in child safety and age-appropriate design are part of the solution. We support and recognize those who lead, and advocate for standards that raise the floor industry-wide.

Industry & Developers

Get Involved

Healthier digital environments start with collective action.

Safe Scroll is a call to action for everyone who believes that young people deserve better from the digital world. Whether you're a parent, educator, advocate, or concerned citizen — there's a meaningful role for you here.

Learn the Issue

Explore the evidence and structural analysis above. Understanding how these systems shape young people's digital lives is the first step toward changing them.

Support Policy Reform

Advocate for stronger digital child protection legislation in your jurisdiction — from age-appropriate design mandates to algorithmic transparency requirements.

Share the Mission

Help expand awareness by sharing Safe Scroll with parents, teachers, school administrators, and anyone concerned about youth digital wellbeing.

Bring It to Schools

Safe Scroll resources and frameworks can be adapted for school and community settings. Reach out to explore how we can support digital wellness education in your institution.