Youth Advocates Push for Algorithm Control for Safer Social Media

Over 93 percent of young Australians use social media daily, and 73 percent have sought mental health support on these platforms, according to data presented by ReachOut Youth Advocates to the Joint Committee on Social Media and Australian Society.
While advocates acknowledged social media’s benefits, they raised concerns about risks like cyberbullying, misinformation, and addictive algorithms.
Rather than focusing solely on age verification, they urged a multi-faceted strategy for safer digital experiences.
First, the youth advocates proposed integrating digital literacy into education, akin to driver training, to teach safe social media use.
Second, they called for greater user control over algorithms, allowing young Australians to shape their feeds and reduce exposure to harmful content.
Third, they recommended a “safety by design” approach, urging tech companies to embed safety features directly into platforms and bear legal responsibility for user well-being.
“We don’t think age verification is a simple short-term step because it’s just not where it needs to be,” explained Layla Wang, a youth representative from ReachOut Youth Advocates.
Representatives noted unresolved questions, including who would manage the verification process, whether it would be routed through government, third parties, or social media companies, and what documentation would be required.
Arjun Kapoor, a member of the eSafety Commissioner’s advisory council, raised concerns that unclear plans generate anxiety among young users.
“If age verification upholds privacy, accuracy, and is implemented across all platforms, young people would likely be okay with it. But those are big ifs,” Kapoor said.
“One of the main things we need is relevant education, preferably taught by young people with lived experience,” Wang stated.
Public health-style campaigns were also recommended, with messaging promoting safe social media use displayed in public spaces and across digital platforms.
“This kind of foundational digital literacy is a key point every young Australian reported they wanted to see,” Kapoor said.
The proposal recommended that education should extend beyond young people to include parents, equipping them to guide rather than restrict young people online.
“Young people should have a say in what content they’re shown,” Kapoor noted, explaining that the constant flow of algorithmically driven content can steer users toward distressing or unhealthy material.
The advocates also recommended implementing content verification processes to ensure reliable information reaches users, especially vulnerable youth.
“Guidance from mental health experts, such as ReachOut, could be pivotal,” Wang suggested.
“Big tech must be held accountable for the negative impacts of their platforms,” Kapoor stated.
The advocates said tech companies should embed safety measures and work with regulators to create enforceable standards, preventing risks before they arise and lessening the burden on users to navigate potentially harmful content.
“Digital literacy should be a coordinated effort from multiple stakeholders,” Kapoor explained.
Another advocate, Vijiyan, noted that some platforms, such as Meta’s Instagram for teens, already incorporate age-specific safety features, but the group believes more needs to be done across the board.
The group envisions a landscape where companies, policymakers, and young people co-create policies, resulting in environments where the positive aspects of social media can be enjoyed safely.