The UK Online Safety Act aims to keep everyone in the UK safe online.
Google is committed to that objective and to following the rules of the UK Online Safety Act. We're working hard to make sure our services are safe and secure.
How we Protect Users
At Google, we aim to balance delivering information with protecting all users, including children. We take this responsibility seriously.
The UK Online Safety Act imposes duties relating to illegal content and content harmful to children.
- Illegal content includes child sexual exploitation and abuse content, terrorist content, and other types of illegal content, like intimate image abuse.
- Content harmful to children includes pornographic content, content that promotes suicide, self harm and eating disorders, bullying, abuse and hate content, and violence, dangerous stunts and harmful substances content.
Our Safety Centre outlines our approach to protecting you from illegal and harmful content.
We have developed and rely on various measures to prevent, detect, and respond to abuse, including illegal content and content harmful to children. In particular, our product policies set clear guidelines around what content is and is not allowed on our services.
We detect potentially violative content through a combination of human review and automated processes. We also rely on the Google community to help us identify such content by reporting it directly to us. More information on reporting content is set out below.
Where we learn about the presence of content that violates our policies, we deal with it swiftly by taking appropriate content moderation or enforcement actions.
Visit the Google Transparency Center, the Google Help Center, the Google Terms of Service, and our product policies to learn more about our efforts to protect users from violative content. You can also learn more about our approach to fighting child sexual abuse online across Google on our Protecting Children page. You can find YouTube’s policies and enforcement approach for child safety and violent extremism in its Transparency Report and the YouTube Help Center.
Our Google Families site outlines the tools available to help you protect your family online, choose the right parental controls, and empower your children to safely explore the online world.
How we use proactive technology to protect users
The Safety Centre explains our overarching approach to using proactive technology to prevent, detect and respond to content that violates our policies.
Our services employ a range of technologies to protect users from violative content. These may include:
- Content classifiers, which use artificial intelligence to help quickly flag potentially policy-violative content for removal or escalation to a human reviewer.
- Hash-matching technologies, which create a unique digital fingerprint for images and videos so they can be quickly compared against the fingerprints of previously identified violative content.
These technologies are generally applied around the time users upload content and on an ongoing basis, and help us take appropriate actions like blocking or restricting content.
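To make the hash-matching idea above concrete, here is a minimal illustrative sketch in Python. It uses an exact cryptographic hash (SHA-256) and a hypothetical set of known fingerprints purely for demonstration; production systems typically use perceptual hashing so that near-duplicates of known content also match, not just byte-identical copies.

```python
import hashlib

# Hypothetical fingerprint database for illustration only.
# This entry is the SHA-256 digest of the bytes b"foo".
known_violative_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Create a digital fingerprint (here, a SHA-256 digest) for a file."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Compare an upload's fingerprint against known violative fingerprints."""
    return fingerprint(data) in known_violative_hashes

# At upload time, a match would trigger blocking, restriction, or review.
print(matches_known_content(b"foo"))  # True: matches the known fingerprint
print(matches_known_content(b"bar"))  # False: no match, content proceeds
```

The key design property is that only fingerprints, not the content itself, need to be stored and compared, which makes checking each upload against a large corpus of previously identified material fast.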
Visit the Google Transparency Center or the Google Help Center to learn more about how we use proactive technology on our products. You can learn about the proactive technologies we use to fight child sexual abuse online on our Protecting Children page. Our Help Centre also includes information on YouTube’s approach to age-restricted content and how Search’s use of SafeSearch helps protect users from explicit content.
Ensuring age-appropriate access to our services
Some Google content and features are age restricted. Visit the Google Account Help Centre for more information about accessing age-restricted content and features.
How to Report Content
If you see content on Google services that you think is illegal, harmful to children or violates our policies, please tell us! Here’s how:
You can report content that violates our policies through our policy reporting tools.
Other important resources include our Legal Help Center, which explains how to submit a legal removal request.
When you report content through one of our webforms, the request is routed to the team best positioned to help. Occasionally we might ask you for some more details.
Unless noted otherwise, once we have made a decision on a legal removal request, we will inform you of the outcome. We may also notify you when we make a decision about content you report through our policy reporting tools. In both cases, we may take the action requested, deny your request or take some alternative action that we deem appropriate. In limited circumstances, we may make a user aware that a complaint has been made against their content. In the case of a request by the government or a court, we may include the name of that government or court in our notice to the user. We typically make a decision within 10 business days, though some more complex cases may take longer.
If you find illegal or harmful content on a non-Google service, you should contact the content provider directly. To report content in an app downloaded from Google Play, you may also contact the app provider.
Appeal a decision
If your content or account has been restricted or removed, you may be able to appeal that decision in some cases. Appeals are typically resolved within 10 business days, though some more complex cases may take longer. When the appeal is decided we will inform you of our decision, which may include taking the action requested in your appeal, denying your appeal, or taking some alternative action that we deem appropriate.
Visit the Google Transparency Center or the Google Legal Help Center to learn more about Google policies and how to appeal a decision.
If your account has been incorrectly age restricted, you may be able to demonstrate that you meet Google’s minimum age requirements by using a government ID or credit card to verify your age. Visit the Google Account Help Center to learn more about that process.
Tools & Tips for Online Safety
Google offers many helpful resources to stay safe online. Here are some places to visit:
- Google Safety Center: Learn the basics of staying safe online, including how to use technology in a way that’s right for your family.
- Google Families: Get tools and advice to help families, particularly children and young adults, navigate the internet safely.
- YouTube Privacy and Safety Center: Get specific advice for protecting yourself on YouTube.
If you or someone you know has experienced serious harm online, there is support available to help, through organisations and resources recommended by Ofcom.
Google’s compliance with the Online Safety Act
Google takes its obligations to comply with the Online Safety Act seriously. If you believe Google isn't following the rules of the UK Online Safety Act, or has used proactive technology other than in accordance with our terms of service, you can report this. Let us know through this webform how we can do better.
We will review your report and typically make a decision within 10 business days, though some more complex cases may take longer. We may also reach out for more information from you.