
Young people navigating 'The Algorithm'


Image: Photo by Josh Withers on Unsplash

When 15-year-old Ayush logs onto social media, he sees a constant stream of content to watch, like, share, and comment on. But what he’s really curious about is how social media platforms actually decide what he sees. He’s not alone. Millions of young Australians are interacting in online spaces shaped by algorithms, and while many report positive experiences, the complexities of how these systems work raise important questions about agency, wellbeing, and safety.

At PROJECT ROCKIT, we’re deeply committed to elevating the voices of young people. That’s why we mobilised our National Youth Collective – a group of 32 paid lived-experience experts aged 12 to 20 – to launch the Shaping Our Feeds project. Through this initiative, we surveyed over 1,000 young people across the country to explore their views on social media algorithms, what works, and what they’d like to see change.

The responses paint a complex picture of life online. Young people broadly recognised the value that social media brings into their lives, with 86% reporting positive impacts such as new friendships, skill development, and access to mental health resources. One young person shared, “I think it’s a wonderful way to connect to other people… it can make people feel less alone.” Another respondent reflected, “Social media has allowed me to develop my individuality,” highlighting the opportunities for identity-affirmation and self-expression.

“Social media has allowed me to develop my individuality.”


Yet, these positive experiences sit alongside significant concerns. 52% of young people said they had encountered negative impacts on social media, ranging from exposure to disturbing or inappropriate content to feelings of isolation and unhealthy comparisons. One person described the frustration of feeling trapped in endless scrolling, noting, “Well, I don’t even enjoy what I’m watching, but I watch it anyway, as it pulls me in and takes up my time.”

For many, social media is a tool that cuts both ways – sometimes offering meaningful connections and at other times amplifying issues like body image struggles or bullying. One young participant explained, “I had a really unhealthy body image when seeing many posts about makeup, exercise, and dieting, and thought I wasn’t doing enough.” Another commented on the harm of discriminatory content: “I have seen things that are hateful which have affected my mental health – specifically hate towards minority groups, targeting them for things out of their control. It makes me feel unloved and insecure about who I am.”

“I had a really unhealthy body image when seeing many posts about makeup, exercise and dieting.”


Despite these challenges, young people are not passive users. They want more agency in shaping their feeds, with 56% expressing a desire to be able to reset their algorithms and start fresh. Many asked for more transparency and control, with one young person putting it plainly: “I would like more control over the content I see and better transparency about how it's selected.” They also noted the importance of being able to filter out harmful content themselves, while appreciating the efforts platforms are already making to proactively detect and remove such content through machine learning. As another respondent observed, “They help me make sure I don’t see inappropriate or offensive things online.”

Young people are eager to better understand the technologies that shape their experiences, with 45% having actively searched for information about how algorithms work. Yet there remains a gap between perceived and actual knowledge. As National Youth Collective member Michael observed, “Most young people feel confident in their knowledge of how algorithms work, but it’s likely that they are overconfident in their knowledge.”

Alongside the report, we are rolling out a new suite of educational videos aimed at enhancing young people's literacy and capacity to understand and navigate social media algorithms with greater confidence. Created by and for young people, these new resources map directly onto the report's findings to target the areas where young people expressed a need for greater understanding and control.


What’s clear from the Shaping Our Feeds report is that young people are not just passive consumers of social media. They actively navigate a blend of positive and negative experiences every day. Their online lives are rich with meaningful connections yet woven with challenges that differ significantly from person to person and across contexts. Focusing solely on online safety risks oversimplifying these realities and can inadvertently limit their right to social participation, as well as access to valuable resources and support.

The young people involved in this project are calling for tools and opportunities to take control of their online experiences. They want to shape their own feeds – not have them dictated by machine learning or switched off altogether. Yet, amid ongoing conversations about young people and social media, it is young people’s own voices that are too often starkly absent. They are not merely seeking safety; they are seeking richer experiences, greater agency, healthy content, and authentic connections.

The directive to big tech and government is clear: we must actively engage young people in the design of solutions to the challenges they face. Only by building their capacity to play a leading role can we truly support their safety and wellbeing while preserving their participation rights. It’s time to move beyond protection and towards collaboration in order to hold digital platforms accountable for providing healthier, richer, and safer experiences for young people.

For more insights and recommendations, check out the full Shaping Our Feeds report.

This project has been made possible with support from Meta, but it is an independent initiative guided by the voices of our National Youth Collective, with no editorial oversight from Meta. We hope that senior leaders at social media companies will take note of our findings and implement meaningful changes to foster healthier online experiences for young people. PROJECT ROCKIT presented these findings to leaders at the Meta APAC Youth AI Literacy & Safety Summit in late October 2024.

Image: Lucy Thomas (PROJECT ROCKIT CEO) presents at the Meta AI Literacy Summit
