Google moves to blur images it considers sexual or violent for all users

The world’s most-used search engine now blurs images that it classifies as explicit and has made it easier to remove sensitive photos published without permission


Google, the most widely used search engine in the world, now blurs content that it considers sexual or violent. A search for, say, “injury” or “sex” on Google Images returns some results blurred by default, and they cannot be viewed unless the user clicks “view image.” Since August, SafeSearch, Google’s automated filtering system, has been blurring sexually explicit or violent results by default for all users.

The technology company, which turns 25 this month, is seeking to add an extra layer of security and privacy to its search engine. “Our systems seek to evaluate whether content is shocking or sexually gratifying. SafeSearch is not designed to filter explicit content that has significant artistic, educational, historical, documentary, or scientific value,” Google says. The company has issued a guide to help website owners identify this type of content.

Google registers over 3.5 billion searches per day, according to the analysis company Semrush. Before the new guidelines were introduced, SafeSearch blurred explicit results only for minors between the ages of 14 and 18, and only if they were logged in to their Google accounts. Now, sexual and violent content is blurred for all users, and minors in this age group remain protected by default. The tool can also exclude sexual or violent content from search results entirely, but that setting is not enabled by default.

Google’s new feature blurs images that it classifies as sensitive, but it does not block them. Paloma Llaneza, a lawyer and cybersecurity expert, points out that these measures merely mask content and can be easily removed. They “do not limit access,” she explained. Users can adjust SafeSearch settings and disable it at any time, unless a parent or the administrator of an educational network has blocked access to Google settings. Instagram and Facebook, both owned by Meta, have similar features that blur images and display an alert on explicit posts.

Parental control

As of February, parents can also manage SafeSearch as a parental control through Google’s Family Link app, which lets them set screen time limits and enforce content restrictions, among other features. “They are easy to use and allow you to understand what your children spend their time looking at on their devices and to manage privacy settings,” the company explained to EL PAÍS in March.

According to a report published by the Spanish National Technology and Society Observatory, 98% of children aged between 10 and 15 have been using the internet regularly since the pandemic. Misinformation, manipulation, psychological damage and addictions are some of the consequences of minors being exposed to inappropriate content, as laid out by the Spanish National Cybersecurity Institute (INCIBE). Cybersecurity expert Hugo Álvarez says that the effects can be very damaging. “They distort reality, normalizing violence and contributing to making young people think that the model of a sexual relationship is that of a porn movie, which almost always objectifies women,” he says.

Personal explicit images

Another of Google’s new developments to improve user privacy is the company’s updated policy on personal images. If someone appears on Google in an intimate situation, performing a sexual act, or naked, without having given their consent, they can ask the company to remove the image, provided they are not profiting financially from it. Users can also request the removal of content that has been republished after previously being deleted, and the process for requesting removal of this type of material is now simpler. These changes give users more control over their image and digital footprint, although it should be noted that what is removed from Google does not disappear from the web or from other search engines.

Results about you

Last year, the technology company launched the “Results about you” tool to allow users to remove their personal information, such as phone numbers, street addresses or email addresses, from search results. This feature also notifies the user when their personal data appears on Google. At the moment, the new system is only available in the United States for search results in English, but the company has stated it is “working to bring it to new languages and locations soon.”

Although search engines benefit from increasingly advanced and secure tools, the Spanish Data Protection Agency points out that they are not foolproof. The agency recommends accompanying these methods with “adequate education on the safe use of technology, the dangers of the internet, and the importance of minors being able to take their own security measures.”
