There has been a large push towards introducing a ban on social media for under-16s, with over half of the adults surveyed saying they would support new legislation prohibiting under-16s from using such platforms. This comes as new research is released to coincide with the wide-scale ban introduced in Australia, which came into effect on 10 December.
The survey of 2,000 adults from across the UK was commissioned by safety tech firm Privately SA, whose age-estimation technology is being deployed by at least three of the ten largest social media platforms in Australia. The findings illustrate widespread concern about how much exposure children get to the online environment. Even so, a large proportion of adults acknowledge that children need to be online to learn, socialise and play, and that a deeper understanding of age-appropriate, rather than exclusionary, experiences is needed.
“This research shows UK parents are facing the same dilemma as families worldwide: they expect platforms to ensure far better protections for children but also want those children to benefit from being online,” said Deepak Tewari, CEO of Privately SA. “Blanket bans may be the starting point of the debate, but the real opportunity is for platforms to create safe, curated experiences for younger users without excluding them from digital life.”
Despite the large number of adults backing an outright ban, others support alternative safeguards. Around a third of those surveyed said platforms should create dedicated experiences for young users, while 25% favour stronger guardrails and restrictions after an age check.
There should also be more training to raise awareness of online safety, including better ways to verify a user's age so that children are not exposed to inappropriate content.
However, support rises sharply when privacy protections are guaranteed. Three times as many people (39%) say they would accept facial age-estimation if it were carried out entirely on-device, with images never leaving the device. While 42% of adults express general comfort with age-estimation technology, the findings indicate a need for clearer communication about how such systems work and how data is kept private.
“What’s needed is privacy-first age assurance that lets platforms know whether a child is using their services without collecting or storing sensitive biometrics or ID information,” Tewari added. “On-device age estimation now makes this possible. It’s a rights-respecting solution that protects children and their privacy while supporting their digital inclusion.”
Privately SA has performed millions of on-device age checks in 2025 and is actively supporting the shift towards age-appropriate online experiences in 90 countries, including Australia and the UK. All checks take place locally on the user’s device, with no images or biometric data stored, uploaded, or shared.
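To make the on-device approach concrete, the sketch below shows the kind of data flow such a check could follow. It is illustrative only and not Privately SA's implementation: the model call (estimate_age_from_image) is a hypothetical stand-in for a local age-estimation network. The point it demonstrates is that the image and the numeric age estimate never leave the device; only a coarse over/under-age signal is shared with the platform.

```python
# Minimal sketch of privacy-first, on-device age assurance (assumed design,
# not a vendor implementation). The camera frame is processed and discarded
# locally; only a pass/fail signal is returned for the platform to use.

from dataclasses import dataclass


@dataclass(frozen=True)
class AgeSignal:
    """The only data shared with the platform: no image, no biometrics."""
    meets_minimum_age: bool   # e.g. True if the user appears to be 16 or over
    minimum_age: int          # the threshold that was checked


def estimate_age_from_image(image_bytes: bytes) -> float:
    """Placeholder for an on-device age-estimation model (hypothetical).

    A real implementation would run a local neural network over the camera
    frame; here a fixed value is returned so the sketch stays runnable.
    """
    return 17.2


def on_device_age_check(image_bytes: bytes, minimum_age: int = 16) -> AgeSignal:
    """Run the age estimate locally and return only a coarse pass/fail signal.

    The raw image and the numeric estimate stay inside this function, which
    mirrors the "no images or biometric data stored, uploaded, or shared"
    design described above.
    """
    estimated_age = estimate_age_from_image(image_bytes)
    signal = AgeSignal(meets_minimum_age=estimated_age >= minimum_age,
                       minimum_age=minimum_age)
    # Nothing else is retained: image_bytes and estimated_age go out of scope.
    return signal


if __name__ == "__main__":
    # Simulated camera frame; in practice this would come from the device camera.
    fake_frame = b"\x00" * 1024
    result = on_device_age_check(fake_frame, minimum_age=16)
    print(result)  # AgeSignal(meets_minimum_age=True, minimum_age=16)
```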
Young people under 16 in Australia will not be able to access their social media, Twitch and YouTube accounts once the ban comes into force this week.
Last year, the country's government passed a new law which aims to stop children from using some social media platforms.
The ban will affect lots of different platforms, including Facebook, TikTok, Snapchat and YouTube – although it doesn’t apply to YouTube Kids.
All accounts that already exist will be shut down from 9 December, and from 10 December no one under 16 will be able to open an account.
In the UK, the Online Safety Act sets out the standards for social media platforms such as Facebook, YouTube and TikTok.
Since July this year it has brought in new measures to try and protect under-18s from seeing harmful content on social media, including hate speech and violence.
It means that to see this content, people have to prove their age, with platforms using secure methods such as facial scans, photo ID and credit card checks. This makes it much harder for under-18s to access harmful content, whether accidentally or intentionally.