Larry Magid: At Safer Internet Day, teens seek safer, smarter tech design
Last week I had the privilege of hosting about 100 high school and college students, lawmakers, educators, law enforcement and tech executives at ConnectSafely’s Safer Internet Day event in Sacramento. There were a couple of panels and a “fireside chat,” but it was mostly tableside conversations among stakeholders, including executives from Google, Meta, OpenAI, Snap, TikTok, Amazon, Roblox, Apple and Discord. The event, organized by ConnectSafely in partnership with Children Now and National PTA, focused on preserving technology’s benefits while reducing its risks through thoughtful, research-informed discussion.
Managing risk, not eliminating it
As ConnectSafely CEO, I opened the day by pointing out that risk exists in nearly every part of life. The goal is not to eliminate risk, but to manage it responsibly. Like driving a car or playing a sport, technology requires guardrails. When grounded in credible research, those guardrails expand opportunity rather than restrict it.
Ted Lempert, chair of Children Now and a former California assemblymember representing parts of Silicon Valley, reminded the audience that students are not just future leaders but leaders today. He said policymakers must listen to their lived experience and noted that early optimism about the internet did not fully anticipate its risks. Appropriate regulation, he argued, can help society benefit from technology while addressing its dangers.
Heather Ippolito, president of the California State PTA, said parents are increasingly concerned about the mental health and safety impacts of social media and AI and stressed the need for practical guidance and education.
Legislator weighs in
In a conversation with youth moderator Ava Smithing of the Young People’s Alliance, Assemblymember Rebecca Bauer-Kahan (D–Orinda) discussed the challenge of regulating rapidly evolving technologies when companies often have greater resources than government. Although companies are structured to maximize growth and engagement, she said, government has a responsibility to establish guardrails that protect the public, especially young users.
Digital safety, she argued, must focus on safer design rather than responding only after harm occurs. Referring to her social media warning label law signed in October 2025, she noted that describing social media as a “public health crisis” marks a significant shift in the national conversation.
“I don’t think putting seatbelts in the car takes away your autonomy,” she said. “The safer we make it online, the more autonomy we can give kids.”
Fixing, not rejecting, technology
Throughout the day, ConnectSafely Education Director Kerry Gallagher led research roundtables where participants reviewed recent studies in light of their own experiences.
They discussed research on the increasingly blurred line between video games and gambling. Features such as loot boxes raised concerns because they introduce financial risk-taking before young people fully understand probabilities or consequences. Participants called for stronger age assurance and expanded financial literacy education.
Another report reflected skepticism from both youth and parents about sweeping social media bans. Students emphasized that social media plays an important role in maintaining friendships, organizing activities and finding community. Rather than blanket prohibitions, participants suggested improving in-app protections, fixing algorithms to reduce excessive use, strengthening parental tools and expanding digital literacy education.
These discussions reflected an emerging trend. Young people are not rejecting technology, but calling for safer, more transparent and more thoughtful design. A 2025 Pew Research Center study found that 74% of teens say social media makes them feel more connected to their friends, even though nearly half, 48%, say it has a mostly negative effect on people their age, up from 32% in 2022. But just 14% say it has a mostly negative effect on them personally.
On artificial intelligence, a June 2024 report from Harvard’s Center for Digital Thriving, based on a national survey of young people ages 14 to 22, found that 41% expect generative AI to have a mix of positive and negative impacts on their lives, while about one in five, 19%, expect the impact to be mostly negative.
AI: Innovation and youth concerns
Later, Gallagher moderated a panel focused on artificial intelligence where high school junior Aleeza Siddique described how deeply generative AI tools are already embedded in academic life, from brainstorming to drafting assignments. But she also raised concerns.
“I’m a teenage girl and one of my biggest concerns with AI is the way that people can use AI to harm me,” she said. She also worries that “AI is being used to take my autonomy.” She called for systems to be introduced at a sustainable pace and held to standards that make them worth adopting “by our own choice.”
During audience discussion, students raised commonly cited concerns about AI’s environmental impact, pointing to its water and energy use. They also warned about displacement of entry-level jobs, the growing threat of deepfakes and nonconsensual imagery, and questioned the fairness of training AI systems on human-created works without consent or compensation.
Tech industry panelists did not offer specific answers to concerns about environmental impact or training practices, but they described their safety efforts, including age-appropriate protections, parental controls, safeguards around sensitive topics and bans on sexualized imagery of minors. Regarding job displacement, Meta’s Allison Mishkin noted that railroads once disrupted horse-and-buggy drivers but urged the audience to consider “the new roles and the new opportunities we’re going to be creating.”
Concerns about technology replacing workers are not new. The term “Luddite” refers to early 19th century English textile workers who protested mechanized looms that threatened their jobs and wages.
How tech policy gets made
A policy panel moderated by LaShaun Francis of Children Now examined how technology policy is shaped and why youth engagement is essential.
Andrea Gil of the Youth Leadership Institute said young people must play a direct role in policymaking. “You can’t have a conversation about them without them,” she said.
Alison Merrilees, former chief counsel to the California Assembly Judiciary Committee, urged students to use their lived experience to influence lawmakers. “You guys are subject matter experts in how social media affects you,” she said, noting that decisions are often shaped before public hearings and that courts, as well as legislatures, play key roles in accountability.
Industry representatives agreed. Kristelle Lavallee of Discord said youth participation helps ensure “the policies are good and effective.” Wesley Hernandez of Snapchat added that youth “shouldn’t be just passive recipients of policies,” but active participants in shaping safer online environments.
I sit in on ConnectSafely’s Youth Advisory Council meetings and agree that young people have a lot to contribute, including insights that haven’t occurred to me, despite decades of experience in online safety.
From discussion to recommendations
At the end of the day, representatives from each table shared recommendations grounded in research and lived experience.
Education emerged as a top priority. Participants emphasized expanding digital, media and AI literacy for both students and parents. Young people stressed questioning AI outputs, recognizing manipulation and using technology thoughtfully rather than passively accepting what platforms present.
There was broad agreement that companies must build stronger safety protections into their products. Safety should be built in by design, with clear privacy protections, transparency and safeguards that prioritize youth wellbeing. Families should not be expected to navigate complex systems without support.
Lawmakers were encouraged to craft thoughtful regulations that reflect real experiences. Effective solutions, participants said, require collaboration among policymakers, industry, educators and youth themselves.
Above all, the day reinforced a simple truth: young people are not passive users of technology. Their lived experience provides insights essential to shaping safer and more effective digital environments.
Larry Magid is a tech journalist and internet safety activist. Contact him at larry@larrymagid.com.