
11 Methods for Creating a Google Algorithm Update Resistant SEO Strategy

How can you get off the roller coaster while still maintaining high rankings in the face of Google updates? Include these 11 suggestions in your SEO strategy.

Google’s algorithm updates can give the impression that the search engine is punishing publishers for unknown reasons.

Rankings for websites are never guaranteed.

Nonetheless, by following these 11 tips, you can improve the stability of your rankings and develop a more Google algorithm update-resistant SEO strategy.

1. User Intent Is Just the Start

User intent is important, but it is only a starting point for creating content that generates revenue daily, regardless of algorithms.

User intent is one of several factors to consider when developing algorithm-resistant web pages.

It’s like the beans in your burrito or the cheese on your pizza. It enhances the flavor of your content.

The importance of identifying user intent stems from the fact that it puts you in the mindset of prioritizing the user (rather than keywords).

And it is here that great SEO strategies begin.


2. Make Visitors to Your Website the Center of Your Universe

One psychological writing trick that works fantastically for web pages is writing content that mirrors the visitor’s need to see things through the lens of how they are affected.

Site visitors only interact with pages that are relevant to them.

I know a smart pay-per-click marketer who creates landing pages that are so customized to each visitor that his websites are practically mirrors.

Among the many things this person did was design landing pages that detected whether a site visitor was using an Android or an Apple device. The webpage would then display an “Apple Friendly” or “Android Friendly” icon.

He did this because A/B testing revealed that those icons made the webpage convert at a slightly higher rate for his audience. Silly as it sounds, it worked.
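
To make the idea concrete, here is a minimal sketch of how that kind of device detection might work in the browser. It is an illustration only: the element ID, icon paths, and user-agent heuristic are hypothetical, not details from the marketer’s actual pages.

```typescript
// Minimal sketch (browser TypeScript): display an "Apple Friendly" or
// "Android Friendly" badge based on the visitor's device, inferred from the
// user-agent string. The element ID and icon paths are hypothetical.
function platformBadge(): { label: string; icon: string } {
  const isApple = /iPhone|iPad|Macintosh/i.test(navigator.userAgent);
  return isApple
    ? { label: "Apple Friendly", icon: "/img/apple-friendly.svg" }
    : { label: "Android Friendly", icon: "/img/android-friendly.svg" }; // default for non-Apple devices
}

const el = document.getElementById("friendly-badge"); // hypothetical placeholder element
if (el) {
  const { label, icon } = platformBadge();
  el.innerHTML = `<img src="${icon}" alt=""> ${label}`;
}
```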

Readers are concerned with how a website topic will affect them. When a site visitor lands on a webpage, the world stops revolving around the sun and starts revolving around the site visitor, even in an eCommerce store.

Do Apple’s customers care why the company developed its CPU chip?

No. They only want to hear how it will outperform their expectations and turn them all into heroes.

Zappos grew in popularity because they made it simple to return shoes. Their customer service was excellent because they treated their customers as if they were selfish individuals concerned only with their own needs.

What users want to see more and more is how your site, service, product, or information affects their lives.

3. The Term “Authoritative” Refers to More Than Just Links

Google has no authority metric, even though it claims to want to rank authoritative content.

Language plays a role in determining whether something is authoritative.

For example, sometime after Google Hummingbird, Google appeared to begin incorporating language-related features into search results pages (SERPs).

I noticed that Google began ranking university research pages for a two-word phrase that had previously returned pages from software companies.

The commercial webpages all had far more links pointing to them than the university research webpages did.

Except for one, all commercial pages were removed from the first two pages of the SERPs. The word “research” appeared in the content of that commercial webpage.

The .edu university webpages were not ranking because of .edu magic or links.

For a short time, Google associated this two-word phrase with a specific topic (research) and chose to rank only pages that featured research, which at the time primarily consisted of university webpages.

For that two-word keyword phrase, Google now primarily ranks informational web pages. In other words, for this two-word keyword phrase, informational content is authoritative.

The traditional authority metric is links. Sites with a greater number of links are more authoritative.

However, language can also be used to convey authority. This is demonstrated in search results, where the words used have a greater influence on what is ranked than the influence of links.

Links used to be the single most important factor in propelling web pages to the top of the SERPs. That is no longer true.

Now, it’s as if natural language processing decides which race a webpage will run in, and depending on the user intent and what qualifies as authoritative for that type of content, that race may be on page two of the search results.

For some queries, informational content will race on Track 1 (the top half of the SERPs), while pages with commercial intent may qualify for Track 2 (analogous to the bottom half of those SERPs).

Regardless of how many links that commercial page receives, its content will never be authoritative enough to rank first for that keyword phrase topic.

To summarize, I’d like to introduce the concept that content can be authoritative in a topic-related way.

Users indicate to Google what type of content is relevant to them (via their choices and activities). Based solely on its content, a page can be authoritative for what users are looking for, or not, regardless of links.

4. Providing Comprehensive Content vs. Treating Visitors Like They’re Five Years Old

When people think of authority, they often think of content that is all-encompassing, larger, and pitched at an expert level.

Keep reading, because authority and authoritativeness could be about understanding what users want and giving it to them in the form they want.

It can also take the form of a baby bottle. Sometimes being authoritative means explaining things to site visitors as if they were 5-year-olds.

For eCommerce, authoritative could be a webpage that assists the user in deciding while not assuming that they understand all of the jargon.

Authoritative content can take many forms.

A site visitor, for example, might have the user intent, “I’m dumb, what does XYZ mean?” In that case, authoritative content is content that is at the “I have no idea” level of a beginner.

This may be especially true for sites that review items that contain technical jargon.

A site doing a round-up summary of the top ten budget products may choose to focus on a quick and easy-to-understand summary that does not need to explain the jargon.

The full review webpage can include an explainer in a sidebar or tooltips that define the jargon.
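
As a rough illustration, jargon tooltips can be wired up with a few lines of client-side code. The sketch below assumes each term is marked up with a data-definition attribute holding a plain-language explanation; the class and attribute names are placeholders of my own, not a standard.

```typescript
// Minimal sketch (browser TypeScript): attach plain-language tooltips to
// jargon terms. Assumes markup like:
//   <span class="jargon" data-definition="How fast a lens lets in light">f-stop</span>
// The "jargon" class and "data-definition" attribute are illustrative names.
document.querySelectorAll<HTMLElement>(".jargon").forEach((term) => {
  const definition = term.dataset.definition;
  if (definition) {
    term.title = definition; // the native title attribute shows a browser tooltip on hover
    term.style.textDecoration = "underline dotted";
    term.style.cursor = "help";
  }
});
```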

I’m not saying people are stupid. What I mean is that it is sometimes best to write content as if your site visitors lack intelligence because that is the level at which many people may be operating for a given topic.

Given that there is an almost infinite supply of people who require things to be carefully explained, it can be a winning strategy for long-term ranking success.

5. Use the Search Results as a Guide… to a Certain Degree

In general, it’s best to follow the search results. There is some value in attempting to understand why Google ranks certain web pages.

However, just because you understand why a page is ranking does not mean you should start copying it.

One method for researching search engine results pages is to map the keywords and intents to the top ten ranked web pages, particularly the top three. Those are the most crucial.

This is an area in which current SEO practices can be improved.

Top Two Strategies That Could Be Better

Imitate High-Ranked Websites?

The general rule is to mimic or copy what the top-ranked sites are already doing, but to “do it better.”

The idea is that if the top-ranked sites share XYZ factors, it is assumed that those XYZ factors are what Google looks for on a webpage to rank it for a given keyword phrase.

Isn’t that common sense?

The term “outlier” comes from the field of statistics. When two or more web pages share certain characteristics, those pages are said to be normal. Outliers are websites that are not like the rest.

If your webpage does not have the same word count, keywords, phrases, and topics as the top-ranked sites, software that analyzes search results considers it a statistical outlier.

Search analysis software will then recommend changes so that the outlier page conforms more closely to what is currently ranked.
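
To make the statistics concrete, here is a sketch of the kind of outlier check such tools run, using a single feature (word count) and made-up numbers.

```typescript
// Sketch of the outlier test search-analysis tools apply: compute the mean and
// standard deviation of a feature (word count here) across top-ranked pages,
// then flag any page more than two standard deviations away.
// The word counts are invented for illustration.
const topRankedWordCounts = [1850, 2100, 1990, 2240, 2050];

const mean =
  topRankedWordCounts.reduce((sum, n) => sum + n, 0) / topRankedWordCounts.length;
const variance =
  topRankedWordCounts.reduce((sum, n) => sum + (n - mean) ** 2, 0) /
  topRankedWordCounts.length;
const stdDev = Math.sqrt(variance);

function isOutlier(wordCount: number): boolean {
  return Math.abs(wordCount - mean) > 2 * stdDev;
}

console.log(isOutlier(900)); // true: flagged as an outlier
console.log(isOutlier(2000)); // false: "normal" relative to the top-ranked group
```

Being flagged by a check like this says nothing, by itself, about whether Google will rank the page.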

The problem with this approach is the underlying assumption that Google will rank content based on the qualities that already exist on web pages ranked in search results.

That’s a huge assumption with no logical foundation.

Of course, a statistical outlier can outrank the top three ranked pages.

For example, I’ve ranked pages higher than existing pages by doing things like explaining more, making it easier to understand, and including diagrams and original photos – all while using keywords that the competition wasn’t using.

Not only did my pages end up with a different keyword mix, but the content in general was designed to better answer the question implicit in the search query.

That is the distinction between concentrating on keywords and concentrating on the search query.

In my opinion, understanding the search query is far more important than analyzing web pages to identify Factors XYZ that may or may not have anything to do with why those pages are ranking.

Updates over the last several years have been focused on better understanding what search queries mean and what pages users want to see, among other things.

So, doesn’t it make sense to focus on better understanding what search queries mean and responding to them in a way that people (and search engines) can understand?

Analyzing search results can help you figure out what the user’s intent is.

The next step should be to take that information and apply your best efforts to fulfilling the need implied by the user intent.

Make Bigger and Better Pages?

The second strategy is to create content that is better or more than that of top-ranked competitors.

Both strategies are about beating the competition by imitating its content but making it (vaguely) “better,” longer, or more up to date.

So, if they have 2,000 words of content, you will publish 3,000 words.

And if they have a top ten list, outrank them by having a top 100 list.

The concept is similar to a comedy set-piece in which a deranged man explains his strategy for outselling the popular 8-Minute Abs video by creating a video called 7-Minute Abs.

Just because content is longer or contains more of what a competitor has does not mean it is better, inherently easier to rank, or easier to obtain links to.

It must still be useful.

So, instead of chasing vague recommendations (be ten times better) or concrete but arbitrary ones (publish more than your competitor), how about simply being useful?

Return to Search Results as a Reference

Mining the search results to learn why Google ranks web pages will not yield useful information.

What you may be able to comprehend is the user intent and what I refer to as the Latent Question, which is inherent in every search query.

You can read more about it here: Analysis of Search Results: The Latent Question

6. Make Your Promotional Strategy Diverse

It is never a good idea to promote a website in only one way. Anything that spreads the word is a good thing. Produce podcasts, write a book, be interviewed on YouTube, appear on television, and so on.

Be as visible as possible so that how the site is promoted and how people learn about the site comes from a variety of sources.

This will aid in the development of a solid foundation for the site that can withstand changes in the algorithm.

For example, if word-of-mouth signals become more important, a site that has focused on word-of-mouth promotion will be prepared.

7. Make an Effort to Prevent Link Rot

Link rot occurs when the pages that link to your webpage lose links themselves, reducing the amount of influence they confer on your website.

Read more about Link Rot – What It Is and How It Affects Rankings.

Maintaining a link acquisition project, even if it’s a small one, is the solution to link rot. This will help to counteract the natural process by which links lose value.
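
As a rough sketch of one small monitoring step that could fit into such a project, the following checks whether known referring pages are still live and still mention your domain. The URLs and domain are placeholders, and this only catches links that have vanished outright, not referring pages that have merely lost their own link equity.

```typescript
// Sketch (TypeScript on Node 18+, which ships fetch built in): check whether
// known referring pages still exist and still mention your domain.
// All URLs and the domain below are placeholders for illustration.
const referringPages = [
  "https://example-blog.com/best-seo-tools",
  "https://example-news.com/industry-roundup",
];
const myDomain = "example.com";

async function checkLinkRot(): Promise<void> {
  for (const url of referringPages) {
    try {
      const res = await fetch(url);
      if (!res.ok) {
        console.log(`${url}: page gone or erroring (HTTP ${res.status})`);
        continue;
      }
      const html = await res.text();
      if (!html.includes(myDomain)) {
        console.log(`${url}: page is live, but the link to ${myDomain} is gone`);
      }
    } catch {
      console.log(`${url}: unreachable (possible link rot)`);
    }
  }
}

checkLinkRot();
```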

8. Promotion of a Website

Websites must be publicized. A lack of promotion can cause a website’s reach to dwindle over time, leaving it unable to connect with the people who need to see the content.

According to Google’s John Mueller:

“We use a ton of different factors when it comes to crawling, indexing and ranking.

So it’s really hard to say like, if I did this how would my site rank compared to when I do this. …those kinds of comparisons are kind of futile in general.

In practice though, when you’re building a website and you want to get it out there and you want to have people kind of go to the website and recognize what wonderful work that you’ve put in there, then promoting that appropriately definitely makes sense.

And that’s something you don’t have to do… by dropping links in different places.”

As Mueller stated, it is not simply a matter of dropping links in different places. It’s about letting people know the site exists.

It can be done through social media, participation in Facebook Groups and forums, local promotions, cross-promotions with other businesses, and a variety of other methods.

Some refer to this as brand building, in which the name of a company becomes almost synonymous with a specific type of product or website.

9. Link Variety

One reason some sites bounce up and down in the search results can sometimes be traced to a lack of diversity in their inbound links.

Anecdotal evidence suggests that sites that appear near the top of search results have a variety of links from various types of websites.

With the advent of natural language processing (NLP) technologies that can place a greater emphasis on content over links, this may no longer be the case.

However, links – particularly the right kinds of links – continue to play a role.

Setting aside the impact of NLP and focusing solely on links, cultivating a diverse set of inbound links may help a site withstand changes in Google’s link algorithms.

There are various types of links.

  • Useful resource links
  • Links within articles
  • Recommendation links from bloggers

It no longer matters as much if a link carries the nofollow attribute, which asks search engines not to follow it.

Google might decide to follow those links. Some links are also useful in increasing a site’s popularity and awareness.

10. Ranking Indicators and E-A-T

Google uses a variety of signals to determine where a site should be ranked. Google will even disregard spammy links or spammy content in order to rank a site that is performing well in other areas.

According to Google’s John Mueller:

“A lot of times what will happen is also that our algorithms will recognize these kind of bad states and try to ignore them.

So we do that specifically with regards to links… where if we can recognize that they’re doing something really weird with links… then we can kind of ignore that and just focus on the good parts where we have reasonable signals that we can use for ranking.”

“…we try to look at the bigger picture when it comes to search, to try to understand the relevance a little bit better.”

So a site can have qualities that overcome spammy links or spammy SEO. We can only speculate about what those qualities are.

But I suspect it has something to do with how expert, authoritative, and trustworthy the content and webpage are in and of themselves.

11. Keep an Eye Out for Changes

It is critical to be aware of all announced changes to Google’s algorithm to build a site that is resistant to algorithm changes. Changes such as passage ranking, BERT, and how Google ranks reviews must all be kept up with.

Try to figure out what the subtext of the algorithm change is, but do so by asking yourself, “How does this algorithm change help users?”

Don’t speculate on motives when interpreting what an algorithm means. That is always a bad idea and never aids in the development of an actionable ranking strategy.

Instead, consider algorithm changes in terms of how they might benefit a user.

For example, the passage ranking change could be interpreted as a way to surface more content for users, because Google previously struggled with long pages that had subpar SEO.

Recent changes in how Google ranks reviews could be interpreted as Google broadening the range of sites that must be trustworthy and accurate.

This means that it may be beneficial to concentrate on qualities such as trustworthiness and accuracy. It could also mean being more genuine.

By concentrating on the steps outlined above, you can create a high-quality site that can withstand changes to Google’s algorithm.


