New safeguards span Search, YouTube, Maps, and Play to protect minors online
Tech platforms are facing growing pressure worldwide to better protect young users, and Google is moving quickly to align with stricter digital safety expectations in Singapore.
New Age Assurance Measures Rolled Out
Google began rolling out new age verification and protection measures on Feb 2 across its products in Singapore, targeting users under 18. The safeguards apply to widely used services including Google Search, YouTube, Google Maps, Google Play, and Gemini, reinforcing the company’s commitment to safer online experiences for children and teenagers.
Using machine learning, Google estimates a user’s age based on existing account signals such as search behavior and the types of videos watched on YouTube. Accounts identified as belonging to minors automatically receive additional protections, without requiring users to manually declare their age.
Stronger Defaults for Child Safety
For users estimated to be under 18, Google will disable location timeline features in Google Maps, restrict access to adult-only apps on Google Play, and turn on SafeSearch filters by default. YouTube’s digital well-being tools will also be activated automatically, including reminders to take breaks and limits on repetitive viewing of certain content.
Users affected by these changes will be notified via email and through prompts while using Google products. Adults mistakenly identified as minors can confirm their age by uploading a government-issued ID or submitting a selfie, Google said.
Regulatory and Parental Concerns Drive Change
The rollout follows requirements set by Singapore’s Infocomm Media Development Authority and was first announced in October 2025, with implementation scheduled for the first quarter of 2026. The move also reflects growing concern among parents over children’s exposure to online risks such as inappropriate content, cyberbullying, and harassment.
A Ministry of Digital Development and Information survey conducted in February 2025 found strong parental concern about online safety, reinforcing the need for platform-level safeguards beyond existing parental controls.
Part of a Broader Global Shift
Google confirmed that its age assurance solutions will expand to Brazil and Australia as part of a global approach to youth protection. Several countries have introduced or proposed stricter digital regulations in recent years.
Australia has barred users under 16 from platforms such as Facebook, Instagram, and TikTok. China implemented a comprehensive “minor mode” in early 2024 after first introducing gaming restrictions in 2019. Germany requires parental consent for social media use among users aged 13 to 16, while Malaysia plans to enforce a similar ban for those under 16 later in 2026. France, Denmark, Britain, Norway, and Greece are also considering comparable measures.
Complementary Education Efforts
The new safeguards will complement Google’s existing family safety tools, including Family Link, supervised YouTube experiences, and its Be Internet Awesome digital literacy initiative. Separately, Google announced in October 2025 the fourth edition of its YouTube Creators for Impact programme in Singapore.
Six local creators will take part in the initiative, producing content to raise awareness about online harms such as cyberbullying and harassment, while guiding young users toward support resources.
Google’s latest measures reflect a broader shift toward stricter digital accountability, as governments and platforms respond to rising concerns about youth online safety. For both Indonesians and Singaporeans, the move highlights how technology governance is increasingly shaping cross-border standards for protecting young users in an interconnected digital world.
Sources: Straits Times (2026), Malay Mail (2026)
Keywords: Google Age Verification, Online Safety For Children, Singapore Digital Regulation, Youth Internet Protection