In a bid to provide a safer online experience for kids, Google has announced that in the coming weeks, anyone under the age of 18, or their parent or guardian, can request the removal of their images from Google Images search results.
Removing an image from Search doesn’t remove it from the web but “we believe this change will help give young people more control of their images online,” Google said in a statement late on Tuesday.
Google will also no longer allow ad targeting of children based on their age, gender, or interests.
“We’ll be expanding safeguards to prevent age-sensitive ad categories from being shown to teens, and we will block ad targeting based on the age, gender or interests of people under 18,” the tech giant announced.
YouTube uploads from children will also gradually default to the most private setting.
“We’re going to change the default upload setting to the most private option available (on YouTube) for teens ages 13-17,” said the company.
Google will start rolling out these updates across its products globally over the coming months.
Google currently offers SafeSearch, which helps filter out explicit results when enabled, and it is already on by default for all signed-in users under 13 who have accounts managed by Family Link.
“In the coming months, we’ll turn SafeSearch on for existing signed-in users under 18 and make this the default setting for teens setting up new accounts,” the company said.
On Google Assistant, the company will be introducing new default protections in the coming months.
“For example, we will apply our SafeSearch technology to the web browser on smart displays,” Google said.
Location History is a Google account setting that helps make the company’s products more useful. It is already off by default for all accounts, and children with supervised accounts don’t have the option of turning Location History on.
“Taking this a step further, we’ll soon extend this to users under the age of 18 globally, meaning that Location History will remain off (without the option to turn it on),” said Google.
Google said it is also launching a new safety section that will let parents know which apps follow its Families policies.
Apps will be required to disclose how they use the data they collect in greater detail, making it easier for parents to decide if the app is right for their child before they download it.
“On YouTube, we’ll turn on ‘take a break’ and bedtime reminders and turn off autoplay for users under 18. And, on YouTube Kids, we’ll add an autoplay option and turn it off by default to empower parents to make the right choice for their families,” the company added.