
Google has released an explainer video showing how it handles reviews left for local businesses on Google Maps. Together with an accompanying blog post, it describes the numerous steps Google takes to review and publish user-generated reviews in a matter of seconds.
Google revealed five steps it takes to ensure that Google Maps reviews are accurate and useful.
Step 1: Strict Content Policies
A well-defined content policy serves as the foundation of Google’s approach to moderating reviews left on Google Maps.
Every website that accepts user-generated content must have a clear policy outlining what is and is not acceptable. This helps users understand the boundaries and tells moderators when to intervene.
“We’ve created strict content policies to make sure reviews are based on real-world experiences and to keep irrelevant and offensive comments off of Google Business Profiles.”
Important information about Google Maps Review Content Policy
Google’s content policy outlines the outcome it hopes to achieve:
“Contributions must be based on real experiences and information.”
Google’s content policy lists six types of prohibited activity.
Examples of Maps review policy violations in review content:
- Deliberately fake content
- Copied or stolen photos
- Off-topic reviews
- Defamatory language
- Personal attacks
- Unnecessary or incorrect content
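Google does not publish how its policy is encoded internally. For a site operator adopting the same idea, one option is to express the policy as a single machine-readable list of violation categories that the reporting UI, human moderators, and automated checks all share. In the sketch below, only the category names mirror the list above; every class, field, and function name is a hypothetical illustration, not Google's implementation.

```python
from enum import Enum
from dataclasses import dataclass

class Violation(Enum):
    """Violation categories mirroring the policy list above."""
    FAKE_CONTENT = "Deliberately fake content"
    STOLEN_PHOTOS = "Copied or stolen photos"
    OFF_TOPIC = "Off-topic review"
    DEFAMATION = "Defamatory language"
    PERSONAL_ATTACK = "Personal attack"
    INCORRECT_CONTENT = "Unnecessary or incorrect content"

@dataclass
class PolicyDecision:
    """What a moderator (human or automated) records about a review."""
    review_id: str
    violations: list[Violation]

    @property
    def publishable(self) -> bool:
        # A review is published only if no policy category applies.
        return not self.violations

# Example: a clean review passes, a flagged one does not.
print(PolicyDecision("r-1", []).publishable)                     # True
print(PolicyDecision("r-2", [Violation.OFF_TOPIC]).publishable)  # False
```

Keeping the categories in one place means the same definitions drive what users see when they report a review and what moderators record when they act on it.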
Step 2: Google’s Algorithm Is Integrated With Content Policy
The next step Google takes to protect the integrity of Google Maps reviews is to turn the content policy into training material for both its algorithms and its human moderators.
Google explains:
“Once a policy is written, it’s turned into training material — both for our operators and machine learning algorithms — to help our teams catch policy-violating content and ultimately keep Google reviews helpful and authentic.”
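Google does not disclose what its training material or models look like. Purely as a generic illustration of the idea of turning policy-labeled examples into both machine training data and moderator reference material, the sketch below fits a tiny text classifier on hand-labeled example reviews. The example texts, the labels, and the scikit-learn model choice are all assumptions made for demonstration.

```python
# A minimal illustration: policy-labeled examples become classifier training data.
# The examples are invented; a real system would need far more (and better) data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labeled training examples: review text plus a policy label.
labeled_reviews = [
    ("Great coffee and friendly staff, will be back.",        "ok"),
    ("The pasta was cold but the service made up for it.",    "ok"),
    ("Buy followers now at spam-site dot example!!!",         "off_topic"),
    ("I have never been here, just leaving 1 star for fun.",  "fake_content"),
    ("The owner is a criminal and should be in jail.",        "personal_attack"),
    ("Visit my own shop instead, link in my profile.",        "off_topic"),
]
texts, labels = zip(*labeled_reviews)

# TF-IDF features plus logistic regression: a deliberately simple stand-in
# for whatever models a production moderation system actually uses.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

# The same labeled examples double as training material for human moderators:
# each policy category maps to concrete reviews that do or don't violate it.
print(model.predict(["Lovely little bakery, the croissants are excellent."]))
print(model.predict(["Never visited, one star because I felt like it."]))
```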
Step 3: Google Moderates Reviews Immediately
Google states that as soon as a review is posted, it is routed to their moderation systems for review.
Google employs both human and machine review systems. Within seconds, Google’s algorithms can process a review and approve it for publication.
Rather than relying on humans to complete tasks, Google has traditionally preferred to scale its systems with algorithms.
To determine whether a review is fake, the algorithm considers a variety of factors.
Among the review factors mentioned by Google are:
- Is the content obscene?
- Is the content inappropriate?
- Is the account that left the review engaging in any suspicious behavior?
- Is a sudden increase in reviews driven by news or social media attention, which can encourage fake reviews?
Google explains the operation of its automated system:
“As soon as someone posts a review, we send it to our moderation system to make sure the review doesn’t violate any of our policies.
…Given the volume of reviews we regularly receive, we’ve found that we need both the nuanced understanding that humans offer and the scale that machines provide to help us moderate contributed content.”
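Google only names the kinds of signals it looks at, not how they are combined. As an illustration of the general pattern, the sketch below runs a new review through a few rule checks that mirror the factors listed above (obscene content, suspicious account behavior, an unusual spike in review volume) and either auto-publishes it within seconds or escalates it to a human queue. Every threshold, signal name, word list, and function here is a made-up placeholder, not Google's system.

```python
from dataclasses import dataclass

@dataclass
class Review:
    review_id: str
    place_id: str
    author_id: str
    text: str

@dataclass
class Signals:
    """Hypothetical signals a moderation system might already have on hand."""
    author_reviews_last_hour: int   # account behavior
    place_reviews_last_hour: int    # review velocity at the business
    typical_hourly_reviews: float   # baseline for that business

# Placeholder word list; a real system would use trained classifiers instead.
OBSCENE_TERMS = {"damnexample", "slurexample"}

def moderate(review: Review, signals: Signals) -> str:
    """Return 'publish', 'reject', or 'needs_human_review'."""
    words = set(review.text.lower().split())

    # Factor: is the content obscene or clearly inappropriate?
    if words & OBSCENE_TERMS:
        return "reject"

    # Factor: is the account behaving suspiciously (e.g. mass posting)?
    if signals.author_reviews_last_hour > 10:
        return "needs_human_review"

    # Factor: is there a spike in reviews, as happens when a business
    # suddenly gets news or social media attention?
    if signals.place_reviews_last_hour > 5 * max(signals.typical_hourly_reviews, 1.0):
        return "needs_human_review"

    # Nothing tripped: auto-approve, as the quoted post describes.
    return "publish"

# Example run with invented numbers.
r = Review("r-42", "place-7", "user-3", "Friendly staff, quick service.")
s = Signals(author_reviews_last_hour=1, place_reviews_last_hour=2, typical_hourly_reviews=1.5)
print(moderate(r, s))  # publish
```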
Step 4: Google Promotes Community Moderation
Google has stated that it encourages businesses and the general public to report fake reviews.
This is a common way of moderating user-generated content (UGC).
This method is sometimes referred to as “report-a-post.” It works well because it makes users feel like part of a community and crowd-sources moderation, letting users and businesses apply their own perspective to catch bad reviews that might slip past a moderator or an algorithm.
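For a site implementing report-a-post itself, the core mechanism is small: accept reports against a review and, once enough distinct reporters agree, push the review back into the moderation queue. The threshold, the in-memory storage, and the function names in the sketch below are all assumptions made for illustration; Google has not described how it handles reports internally.

```python
from collections import defaultdict

# review_id -> set of user_ids who reported it (in-memory stand-in for a database)
reports: dict[str, set[str]] = defaultdict(set)

# How many distinct reporters trigger a re-review; a made-up threshold.
REREVIEW_THRESHOLD = 3

def report_review(review_id: str, reporter_id: str) -> bool:
    """Record a report; return True when the review should be re-moderated."""
    # Duplicate reports from the same user don't count twice.
    reports[review_id].add(reporter_id)
    return len(reports[review_id]) >= REREVIEW_THRESHOLD

# Three different users flag the same review; the third report triggers re-review.
for user in ("u1", "u2", "u3"):
    escalate = report_review("r-99", user)
print(escalate)  # True
```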
Step 5: Google Is Proactive In Detecting Fake Reviews
Google revealed an intriguing fact: it proactively anticipates events that could attract abusive reviews. It then steps up monitoring of reviews for businesses in the vicinity of those events to ensure that only authentic and useful reviews are published.
Google shared:
“For instance, when there’s an upcoming event with a significant following — such as an election — we implement elevated protections to the places associated with the event and other nearby businesses that people might look for on Maps.”
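Google does not say how these elevated protections are implemented. One way to picture the pattern is a setting that tightens moderation for places associated with a monitored event, for example by routing a larger share of their incoming reviews to human review. The event list, the radius, the haversine distance check, and both rates in the sketch below are hypothetical.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class MonitoredEvent:
    name: str
    lat: float
    lon: float
    radius_km: float  # businesses inside this radius get elevated protections

# Hypothetical list of upcoming high-attention events.
MONITORED_EVENTS = [MonitoredEvent("Example Election HQ", 40.7128, -74.0060, 2.0)]

def distance_km(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def human_review_rate(place_lat: float, place_lon: float) -> float:
    """Fraction of this place's reviews to route to human moderators."""
    for event in MONITORED_EVENTS:
        if distance_km(place_lat, place_lon, event.lat, event.lon) <= event.radius_km:
            return 0.5  # elevated protection near a monitored event (made-up number)
    return 0.05         # normal baseline (made-up number)

# A cafe a few blocks from the event gets the elevated rate.
print(human_review_rate(40.7150, -74.0080))  # 0.5
```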
Google Maps Reviews Combine Machine Learning And Human Moderation
Google’s approach to moderating user-generated content is consistent with long-standing practices pioneered on forums and blogs, including the use of automated systems to handle users and events that tend to produce more abusive content.
This article is useful because the steps Google takes can serve as inspiration and a template for developing a moderation strategy on any website or platform that accepts user-generated content.
Citations
Read Google’s blog post: How reviews on Google Maps work
Read Google’s Maps reviews policy: Maps User Contributed Content Policy




