Ending child exploitation means rethinking inclusion

By Anna Abelinde
MY SON talks to Alexa1. He consults ChatGPT on his phone, and has named and renamed the Meta AI according to themes he's interested in. We did not teach him how. He is a 12-year-old digital native who figured out online shopping faster than his father did. He is also on the autism spectrum. Sometimes, I think he uses AI to help him cope with and understand the digital world, where he's largely teased for being a "noob2."
As parents, we are worried. We have safeguards in place: Alexa is located in my workstation, so we can hear all the exchanges. We also have access to his phone and can see all his conversations. So far, they range from game design talk (he is learning how to code) to surviving games in Roblox. I am grateful that we are here to respond to his questions — Why is it not acceptable to fart in public when it is a normal human experience? Why do people tease him for (still) wanting to play with plushies?
But through these interactions, I've learned how children — and even adults who feel unseen — turn to AI to process their feelings and learn social cues in order to belong. At the end of the day, it is fundamentally human to want to be "seen." AI affords that, without the feeling of being judged. And that is where the line between connection and exploitation begins to blur: so many children — especially those who feel unseen, unheard, or different — turn to digital spaces to find belonging.
THE CHILD LABOR PARADOX
Last month, our team was out of town to monitor the implementation of one of our child labor prevention projects.
According to the Philippine Statistics Authority, child labor is on the decline — from 4.7% in 2022 to 2.7% in 2024. Ironically, though, online sexual abuse and exploitation of children (OSAEC) is increasing, with recent data from the Commission on Human Rights indicating that as many as 2.7 million children have become victims of OSAEC.
But here is the thing: OSAEC is itself one of the worst forms of child labor — and while traditional, rural-based child labor is declining, urbanization is reshaping the landscape of exploitation into one that is online, encrypted, and therefore less visible.
These nuances have not been properly addressed, as government and civil society interventions remain largely compartmentalized and have been unable to address systemic issues that contribute to the perpetuation of all forms of child exploitation.
THE DIGITAL WORLD WE BUILD
While children and young people can thrive in digital spaces, these same spaces can also be dangerous. And the rapidly evolving, volatile, uncertain, complex, and ambiguous world asks us: what kind of digital world are we building for our children? Are we creating spaces that welcome their curiosity, respect their diversity, and help them form meaningful connections? Or are we too slow to adapt, therefore putting them in spaces where their innocence is commodified, and their vulnerability exploited?
Because exclusion — whether by disability, stigma, or poverty — forces children to seek connection elsewhere, and the online world can be just as exploitative as the offline one, because both are made up of the same humans and the same systems.
My point is this: inclusion is more than a buzzword. Inclusion is protection.
When children with disabilities are embraced in schools instead of isolated, they are less likely to disappear into digital loneliness. When communities build safe play spaces and strong relationships, children are less likely to seek false safety online. When digital platforms are designed with empathy and accountability, the internet becomes less of a hunting ground. When opportunities for decent work are present for their parents, then children will become less pressured to take part in the family’s economic burden.
Ending exploitation, whether physical or digital, requires us to ensure that no child feels invisible. And we can do that by acknowledging that our child protection systems were designed for spaces that no longer exist. We need agile and accountable systems to detect red flags online, and to implement existing laws in the spirit in which they were intended. We need to understand the barriers to reporting cases, and equip our law enforcers, teachers, and social workers with the right knowledge and attitudes for addressing them.
We must also push tech companies to take responsibility, to design safer platforms, invest in AI for child protection, and ensure their profit models do not depend on human suffering.
This year's National Children's Month theme, "OSAEC-CSAEM Wakasan: Kaligtasan at Karapatan ng Bata, Ipaglaban," also calls for reflection on our child protection systems — or, as Gen Alpha might put it: Are we content with being "six-seven3" when we could just be "lit4?"
1Alexa or Amazon Alexa is a virtual assistant that responds conversationally to questions and provides voice-controlled support for tasks such as playing music and managing home devices like lights and thermostats.
2Noob is slang for a beginner or someone inexperienced, often used playfully or teasingly in gaming and online contexts.
3Six-seven is Gen Alpha slang for “mediocre.”
4Lit is Gen Alpha slang for "exciting."
Anna Abelinde is the country director of Terre des Hommes Netherlands (TdH NL) in the Philippines, an international non-government organization that works with local partners to bridge critical gaps in child protection systems to prevent exploitation of children and young people. TdH NL envisions a future where children, in all their diversity, shape programs and policies, and grow up safe, empowered, and free to create the future they deserve.