Is Google headed towards a continuous “real-time” algorithm?
https://searchenginewatch.com/2022/11/03/is-google-headed-towards-a-continuous-real-time-algorithm/ – 03 Nov 2022

30-second summary:

  • The present reality is that Google presses the button and updates its algorithm, which in turn can update site rankings
  • What if we are entering a world where it is less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”?
  • Advisory Board member and Wix’s Head of SEO Branding, Mordy Oberstein, shares his data observations and insights

If you’ve been doing SEO even for a short while, chances are you’re familiar with a Google algorithm update. Every so often, whether we like it or not, Google presses the button and updates its algorithm, which in turn can update our rankings. The key phrase here is “presses the button.” 

But, what if we are entering a world where it’s less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”? What would that world look like and who would it benefit? 

What do we mean by continuous real-time algorithm updates?

Technology is obviously constantly evolving, but what needs to be made clear is that this applies to Google’s algorithm as well. As the technology available to Google improves, the search engine can do things like better understand content and assess websites. However, this technology needs to be integrated into the algorithm. In other words, as new technology becomes available to Google or as the current technology improves (we might refer to this as machine learning “getting smarter”), Google needs to “make these advancements a part” of its algorithms in order to utilize them.

Take MUM for example. Google has started to use aspects of MUM in the algorithm. However, (at the time of writing) MUM is not fully implemented. As time goes on and based on Google’s previous announcements, MUM is almost certainly going to be applied to additional algorithmic tasks.  

Of course, once Google introduces new technology or has refined its current capabilities it will likely want to reassess rankings. If Google is better at understanding content or assessing site quality, wouldn’t it want to apply these capabilities to the rankings? When it does so, Google “presses the button” and releases an algorithm update. 

So, say one of Google’s current machine-learning properties has evolved. It’s taken the input over time and has been refined – it’s “smarter” for lack of a better word. Google may elect to “reintroduce” this refined machine learning property into the algorithm and reassess the pages being ranked accordingly.    

These updates are specific and purposeful. Google is “pushing the button.” This is most clearly seen when Google announces something like a core update or product review update or even a spam update. 

In fact, perhaps nothing better concretizes what I’ve been saying here than what Google said about its spam updates:

“While Google’s automated systems to detect search spam are constantly operating, we occasionally make notable improvements to how they work…. From time to time, we improve that system to make it better at spotting spam and to help ensure it catches new types of spam.” 

In other words, Google was able to develop an improvement to a current machine learning property and released an update so that this improvement could be applied to ranking pages. 

If this process is “manual” (to use a crude word), what then would continuous “real-time” updates be? Let’s take Google’s Product Review Updates. Initially released in April of 2021, Google’s Product Review Updates aim at weeding out product review pages that are thin, unhelpful, and (if we’re going to call a spade a spade) exist essentially to earn affiliate revenue.

To do this, Google is using machine learning in a specific way, looking at specific criteria. With each iteration of the update (such as those in December 2021, March 2022, and so on) these machine learning apparatuses have the opportunity to recalibrate and refine. Meaning, they can potentially become more effective over time as the machine “learns” – which is kind of the point when it comes to machine learning.

What I theorize, at this point, is that as these machine learning properties refine themselves, rank fluctuates accordingly. Meaning, Google allows machine learning properties to “recalibrate” and impact the rankings. Google then reviews and analyzes and sees if the changes are to its liking. 

We may know this process as unconfirmed algorithm updates (for the record I am 100% not saying that all unconfirmed updates are as such). It’s why I believe there is such a strong tendency towards rank reversals in between official algorithm updates. 

It’s quite common that the SERP will see a noticeable increase in rank fluctuations that can impact a page’s rankings only to see those rankings reverse back to their original position with the next wave of rank fluctuations (whether that be a few days later or weeks later). In fact, this process can repeat itself multiple times. The net effect is a given page seeing rank changes followed by reversals or a series of reversals.  


A series of rank reversals impacting almost all pages ranking between position 5 and 20 that align with across-the-board heightened rank fluctuations 

This trend, as I see it, is Google allowing its machine learning properties to evolve or recalibrate (or however you’d like to describe it) in real-time. Meaning, no one is pushing a button over at Google but rather the algorithm is adjusting to the continuous “real-time” recalibration of the machine learning properties.

It’s this dynamic that I am referring to when I question if we are heading toward “real-time” or “continuous” algorithmic rank adjustments.

What would a continuous real-time Google algorithm mean?

So what? What if Google adopted a continuous real-time model? What would the practical implications be? 

In a nutshell, it would mean that rank volatility would be far more of a constant. Instead of waiting for Google to push the button on an algorithm update for rankings to be significantly impacted, this would simply be the norm. The algorithm would be constantly evaluating pages and sites “on its own” and making adjustments to rank in closer to real time.

Another implication would be not having to wait for the next update for restoration. While not a hard-and-fast rule, if you are significantly impacted by an official Google update, such as a core update, you generally won’t see rank restoration occur until the release of the next version of the update – whereupon your pages will be re-evaluated. In a real-time scenario, pages are constantly being evaluated, much the way links are with Penguin 4.0, which was released in 2016. To me, this would be a major change to the current “SERP ecosystem.”

I would even argue that, to an extent, we already have a continuous “real-time” algorithm. That we at least partially have a real-time Google algorithm is simply a fact: as mentioned, in 2016 Google released Penguin 4.0, which removed the need to wait for another version of the update, as this specific algorithm evaluates pages on a constant basis.

However, outside of Penguin, what do I mean when I say that, to an extent, we already have a continuous real-time algorithm? 

The case for real-time algorithm adjustments

The constant “real-time” rank adjustments that occur in the ecosystem are so significant that they have redefined the volatility landscape.

Per Semrush data I pulled, there was a 58% increase in the number of days that reflected high-rank volatility in 2021 as compared to 2020. Similarly, there was a 59% increase in the number of days that reflected either high or very high levels of rank volatility: 

Data showing volatility - Google moving towards a “real-time” algorithm

Simply put, there is a significant increase in the number of instances that reflect elevated levels of rank volatility. After studying these trends and looking at the ranking patterns, I believe the aforementioned rank reversals are the cause. Meaning, a large portion of the increased instances in rank volatility are coming from what I believe to be machine learning continually recalibrating in “real-time,” thereby producing unprecedented levels of rank reversals. 

Supporting this is the fact that, along with the increased instances of rank volatility, we did not see increases in how drastic the rank movement was. Meaning, there are more instances of rank volatility but the degree of volatility did not increase.

In fact, there was a decrease in how dramatic the average rank movement was in 2021 relative to 2020! 

Why? Again, I chalk this up to the recalibration of machine learning properties and their “real-time” impact on rankings. In other words, we’re starting to see more micro-movements that align with the natural evolution of Google’s machine-learning properties. 

When a machine learning property is refined as its intake/learning advances, you’re unlikely to see enormous swings in the rankings. Rather, you will see a refinement in the rankings that aligns with the refinement in the machine learning itself.

Hence, the rank movement we’re seeing, as a rule, is far more constant yet not as drastic. 
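To make that distinction concrete – more days of volatility versus bigger individual movements – here is a minimal Python sketch. It assumes a hypothetical CSV of daily volatility scores; the file name, column names, and the “high volatility” threshold are placeholders for illustration, not Semrush’s actual schema or cut-offs.

```python
import pandas as pd

# Hypothetical daily volatility scores, one row per day.
# The file name, column names, and threshold are placeholders.
df = pd.read_csv("daily_volatility.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year

HIGH = 5.0  # illustrative cut-off for a "high volatility" day

summary = df.groupby("year").agg(
    high_volatility_days=("volatility", lambda s: (s >= HIGH).sum()),
    avg_movement=("volatility", "mean"),
)

# More high-volatility days year over year with a flat (or lower) average movement
# is the pattern described above: more instances of volatility, not bigger swings.
print(summary)
print(summary.pct_change() * 100)  # year-over-year percentage change
```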

The final step towards continuous real-time algorithm updates

While much of the ranking movement that occurs is continuous in that it is not dependent on specific algorithmic refreshes, we’re not fully there yet. As I mentioned, much of the rank volatility is a series of reversing rank positions. Changes to these ranking patterns, again, are often not solidified until the rollout of an official Google update, most commonly, an official core algorithm update. 

Until the longer-lasting ranking patterns are set without the need to “press the button,” we don’t have a full-on continuous or “real-time” Google algorithm.

However, I have to wonder if the trend is not heading toward that. For starters, Google’s Helpful Content Update (HCU) does function in real-time. 

Per Google:

“Our classifier for this update runs continuously, allowing it to monitor newly-launched sites and existing ones. As it determines that the unhelpful content has not returned in the long-term, the classification will no longer apply.”

How is this so? The same as what we’ve been saying all along here – Google has allowed its machine learning to have the autonomy it would need to be “real-time” or as Google calls it, “continuous”: 

“This classifier process is entirely automated, using a machine-learning model.”

For the record, continuous does not mean ever-changing. In the case of the HCU, there’s a logical validation period before restoration. Should we ever see a “truly” continuous real-time algorithm, this may apply in various ways as well. I don’t want to imply that, should we ever see a “real-time” algorithm, there will be a ranking response the second you make a change to a page.

At the same time, the “traditional” officially “button-pushed” algorithm update has become less impactful over time. In a study I conducted back in late 2021, I noticed that Semrush data indicated that since 2018’s Medic Update, the core updates being released were becoming significantly less impactful.


Data indicates that Google’s core updates are presenting less rank volatility overall as time goes on

Since then, this trend has continued. Per my analysis of the September 2022 Core Update, there was a noticeable drop-off in the volatility seen relative to the May 2022 Core Update.


Rank volatility change was far less dramatic during the September 2022 Core Update relative to the May 2022 Core Update 

It’s a dual convergence. Google’s core update releases seem to be less impactful overall (obviously, individual sites can get slammed just as hard) while at the same time its latest update (the HCU) is continuous. 

To me, it all points towards Google looking to abandon the traditional algorithm update release model in favor of a more continuous construct. (Further evidence could be in how the release of official updates has changed. If you look back at the various outlets covering these updates, the data will show you that the roll-out now tends to be slower with fewer days of increased volatility and, again, with less overall impact). 

The question is, why would Google want to go to a more continuous real-time model? 

Why a continuous real-time Google algorithm is beneficial

A real-time continuous algorithm? Why would Google want that? It’s pretty simple, I think. Having an update that continuously refreshes rankings to reward the appropriate pages and sites is a win for Google (again, I don’t mean instant content revision or optimization resulting in instant rank change).

Which is more beneficial to Google’s users? A continuous-like updating of the best results or periodic updates that can take months to present change? 

The idea of Google continuously analyzing and updating in a more real-time scenario is simply better for users. How does it help a user looking for the best result to have rankings that reset periodically with each new iteration of an official algorithm update? 

Wouldn’t it be better for users if a site, upon seeing its rankings slip, made changes that resulted in some great content, and instead of waiting months to have it rank well, users could access it on the SERP far sooner? 

Continuous algorithmic implementation means that Google can get better content in front of users far faster. 

It’s also better for websites. Do you really enjoy implementing a change in response to ranking loss and then having to wait perhaps months for restoration? 

Also, Google would only rely so heavily on machine learning, and trust the adjustments it makes, if it were confident in its ability to understand content, relevancy, authority, and so on. SEOs and site owners should want this. It means that Google could rely less on secondary signals and more directly on the primary commodity: content and its relevance, trustworthiness, and the like.

Google being able to more directly assess content, pages, and domains overall is healthy for the web. It also opens the door for niche sites and sites that are not massive super-authorities (think the Amazons and WebMDs of the world). 

Google’s better understanding of content creates more parity. Google moving towards a more real-time model would be a manifestation of that better understanding.

A new way of thinking about Google updates

A continuous real-time algorithm would intrinsically change the way we would have to think about Google updates. It would, to a greater or lesser extent, make tracking updates as we now know them essentially obsolete. It would change the way we look at SEO weather tools in that, instead of looking for specific moments of increased rank volatility, we’d pay more attention to overall trends over an extended period of time. 

Based on the ranking trends we already discussed, I’d argue that, to a certain extent, that time has already come. We’re already living in an environment where rankings fluctuate far more than they used to, and that has, to an extent, redefined what stable rankings mean in many situations.

To both conclude and put things simply, edging closer to a continuous real-time algorithm is part and parcel of a new era in ranking organically on Google’s SERP.


Mordy Oberstein is Head of SEO Branding at Wix. Mordy can be found on Twitter @MordyOberstein.


Google Page Experience update is all set to launch in May 2021 – Webmasters, hang in there!
https://searchenginewatch.com/2021/01/04/google-page-experience-update-is-all-set-to-launch-in-may-2021-webmasters-hang-in-there/ – 04 Jan 2021

30-second summary:

  • Google plans to update its algorithm in 2021 to include a factor called Page Experience.
  • This includes existing Google Search signals such as mobile-friendliness, safe-browsing, HTTPS, and intrusive interstitial guidelines.
  • It also includes metrics in Google’s Web Vitals to do with a site’s loading speed, interactivity, and visual stability.
  • For site owners and others, understanding these signals and making the necessary changes should be a priority.
  • Among the steps to take are optimizing for mobile, improving page speeds, refining CTAs, and adding alt text for images.

We’re sure you’ve heard about Google’s announcement this summer. Yes, they’ve made another one. In brief, they said that they’re going to update their algorithm in 2021 to include a factor called Page Experience. This is going to be an important element that has an impact on rankings.

As part of this initiative, they’ve launched Web Vitals – a series of benchmarks essential to measuring and enhancing the user experience on the web.

Hold on. What is Page Experience, anyway? And do you really need to add it to your overflowing to-do list? Let’s take a closer look.

The page experience in a nutshell

Page experience includes all aspects of how users interact with a web page and how good or painful it is for them. (In your case, we hope it isn’t the latter!)

This includes existing Google Search signals: mobile-friendliness, safe-browsing, HTTPS, and intrusive interstitial guidelines.

It also includes metrics in Google’s Web Vitals. Currently, the focus is on three facets: loading, interactivity, and visual stability. 

  1. Loading, in this context, measures perceived load speed. That’s the point in the page load timeline when the main content is likely to have loaded.
  2. Interactivity is the time from when a user first interacts with a page – a click or a tap, for example — to the time when the browser begins processing that interaction.
  3. Visual stability has to do with preventing annoying and unexpected movement of page content.

Google's Page Experience update explained

Source

You may already have optimized for some of these factors. According to Google’s own earlier research, as page load time goes from one second to 10 seconds, the probability of a mobile site visitor bouncing increases by 123%. Ouch!

Google's Page Experience update explained - page speed stat

Similarly, as the number of elements on a page goes from 400 to 6,000, the probability of conversion drops by as much as 95%.

Now, Google is bringing these and other aspects together under one umbrella that is going to have even more of an impact on organic search results.

Visual indicators of page experience

Google has also stated that by next year, they will introduce a visual indicator to designate those search results that meet all of their page experience specifications. 

They’ve done something like this in the past, too. You must have observed, for example, AMP icons as well as slow and mobile-friendly labels.

If this indicator is displayed prominently in search results, there’s a good chance that users will prefer these sites over others.

While Google is yet to announce the shape, size, and position of such indicators, it’s a mark of how seriously they’re taking their forthcoming page experience guidelines. 

This means that all of us should start planning now.

Hold on. Page experience isn’t everything.

Now, you may have read this far and decided that the most important thing is to fix all of the above parameters. And you’ll see your traffic zoom.

That won’t necessarily be the case. (Although we hope it is!) You see, content is still king. Everything starts with that.

As Google themselves point out in their blog,

“Great page experience doesn’t override having great page content.”

However, you can rest assured that when there are many pages that are similar in relevance, your improved page experience will make all the difference in search results.

Why you should pay attention to this algorithm update 

The fact remains that the new page experience metrics should be taken seriously by developers and all those involved in optimization strategies to improve search rankings.

To begin with, if your user experience is seen as being in the top bracket, visual cues will guide consumers and browsers to your page over the others.

Google itself is pretty clear about the increased weight they’re going to give to page experience. After all, a terrific page experience lets people get more done and increases engagement.

It seems evident that those pages which fall below the new benchmarks are going to be left behind in the rankings. This means a significant drop in traffic.

Google already considers hundreds of aspects to determine rankings. The inclusion of page experience lets them guide people, so they can access information more easily and enjoyably.

For site owners and others, understanding these signals and making the necessary changes should be a priority.

Otherwise, you run the risk of your page being ignored. You wouldn’t want that now, would you?

Let’s start with a bad page experience

Before we get down to understanding how to improve page experience, let’s understand what a bad page experience is.

  • Slow page speeds: You know how frustrating it can be to click on a search result and then wait for a page to load. It may be a few seconds, but it feels like an eternity. Chances are, your consumers feel the same way and are put off.
  • Bad structure and design: Even if the page loads quickly, there are times when it can be confusing to navigate. This could be because the design is cliched or just puzzling. There could be too many pop-ups. There could be no proper content structure. Looking for information here could be like looking for a needle in a haystack.
  • Lack of engagement. Unfortunately, too many websites simply assume that their only purpose is to sell. But today’s consumer wants to be engaged with, wants to be entertained, and wants to be understood. That’s why empathy and likeability are important factors.

The steps you can take

There are more than six months to go before these changes take effect. As a webmaster, you have more than enough time to prepare. And there are no excuses for not being ready.

As a site owner or stakeholder, you can take the advice of Aja Frost, head of content SEO at HubSpot. This is what she says: “I think this gives you good ammunition to go to your web team or your performance team and say, ‘Hey, you know, Google…[is] going to release this in six months, and so we need to focus on it.’”

Here are some things to consider.

  • You can start by gaining an understanding of the metrics that Google is going to use. For now, these are LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), and FID (First Input Delay). Google itself provides explanations and standards of measurement, which are useful in gaining mastery of them.
  • Based on this, you can then conduct a site audit. Optimize for these new ranking signals, especially factors such as page load speeds, responsiveness, UX, mobile usability, and security. There are a variety of tools that you can use for this – for example, Google’s online mobile-friendly test and PageSpeed Insights, which act as performance checkers across all devices (a sketch of querying PageSpeed Insights programmatically follows this list).
  • As you know, it takes several individuals working together to create a high-quality website. It’s time to bring these stakeholders together and discuss how this algorithm update is going to be handled. The SEO, UX design, and IT teams should be in perfect alignment when it comes to future goals and actions. You could start by asking them if they’d prefer coffee or tea, and then get the meeting started.
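If you want to pull those metrics programmatically rather than through the web interface, the PageSpeed Insights API can return field data for a URL. The sketch below is illustrative: the v5 endpoint is real, but treat the exact response field names (and whether an API key is needed for your request volume) as assumptions to verify against the current documentation.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_field_metrics(url: str, strategy: str = "mobile") -> dict:
    """Return LCP / FID / CLS field data for a URL, where Google has it."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy})
    resp.raise_for_status()
    data = resp.json()
    # Field (real-user) metrics typically sit under loadingExperience.metrics;
    # the key names below are assumptions to double-check against the API docs.
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {
        "LCP_ms": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "FID_ms": metrics.get("FIRST_INPUT_DELAY_MS", {}).get("percentile"),
        "CLS": metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
    }

print(fetch_field_metrics("https://www.example.com/"))  # placeholder URL
```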

Expert tips to boost page experience

Until now, you’ve received a broad overview of Google’s announcement, what it means for developers and other stakeholders, and some initial steps you can take to prepare.

This is all very informative, but is there anything you can do right away? Yes, there is.

Here are some granular details as to how you can enhance page experience like a boss.

1. Optimize for mobile search

In Q3 2020, mobile devices generated 50.81% of global website traffic, consistently hovering around the 50% mark since the beginning of 2017.

Google's Page Experience update - How to continue to rank - Optimize for mobile

Source

Clearly, these numbers can’t be ignored. Google’s algorithms, too, primarily use the mobile version of a site’s content to rank pages from that site.

If you haven’t already, you should get your page mobile-ready by reducing code, leveraging browser caching, and reducing redirects.

The webpage design should be simple and responsive so as to appear attractive on smaller screens. Site structure, too, should be optimized for mobile.

2. Improve page speeds

According to recent research that confirms Google’s findings, a delay of one full second in loading can decrease conversion rates by 70%. Just one second — shorter than the time it’s taken to read this sentence.

There are several ways to avoid losing out because of frustrating delays. As per Google, best practice is a load time of no more than three seconds.

One way to do this is to minimize HTTP requests. That’s because the more components there are on a page, the longer it takes to render. You can combine files to overcome this.

Further, loading files asynchronously can speed up pages. When a browser loads a page, it moves from top to bottom, and you can use this to your advantage.

Other aspects include examining the operation of JavaScript loading and server response times. Don’t forget to also check on compression, caching, and, importantly, image file sizes. 
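As a rough, do-it-yourself illustration of a few of these checks – server response time, compression, and caching headers – here is a minimal sketch. The URL is a placeholder, and the output is only a coarse proxy for what dedicated tools such as PageSpeed Insights report.

```python
import requests

def quick_speed_check(url: str) -> None:
    """Print a few basic speed signals for a page: TTFB, compression, caching."""
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, br"})
    ttfb = resp.elapsed.total_seconds()  # rough proxy for server response time
    encoding = resp.headers.get("Content-Encoding", "none")
    cache_control = resp.headers.get("Cache-Control", "not set")

    print(f"Approx. time to first byte: {ttfb:.2f}s")
    print(f"Compression: {encoding}")
    print(f"Cache-Control: {cache_control}")

quick_speed_check("https://www.example.com/")  # placeholder URL
```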

3. Separate CTAs

Optimizing for mobile and improving page speeds are the first steps to take, as they have a huge impact on user experience. However, there are other factors that can further improve interaction, not to mention the conversion rate.

One of these is the Call to Action or CTA. Virtually every site has these in some form or another. Consumers are requested to take specific actions, from subscribing to updates, signing up, asking for an appointment, and, of course, making purchases. (Let’s not forget that.)

Google's Page Experience update - How to continue to rank - Use separate CTAs

Source

The trick is to realize that consumers have different frames of mind at different points, so you can customize your CTAs accordingly.

They should be short, specific, and clear about the action needed. Ideally, they should include a benefit. Think of what the consumer will get out of the interaction. Is it to be enlightened, to succeed, or to solve a problem?

The design of CTA buttons is important, too. Naturally, they should be bright, correctly-shaped, and properly positioned.

Think of that as your own call to action.

4. Use Alt Text for images

We’ve already touched upon image compression as a way of providing an optimal loading experience. But there’s another factor involved when it comes to experience as well as page ranking.

This is called alt text. It’s used in the HTML code, and it describes the appearance and function of an image on a page.

Such alt tags will be displayed if the image file doesn’t load, so that users still understand the context. These descriptions are also used by search engine crawlers for indexing, and this helps with rankings.

These alt text descriptions should be short and specific, and ideally include a keyword. This will go a long way in helping your site’s organic search results.
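A quick way to find pages where this is missing is to crawl them and flag images without alt text. Here is a minimal sketch using the requests and BeautifulSoup libraries; the URL is a placeholder and the page is assumed to be publicly fetchable.

```python
import requests
from bs4 import BeautifulSoup

def find_images_missing_alt(url: str) -> list[str]:
    """Return the src of every <img> on the page with empty or missing alt text."""
    html = requests.get(url).text
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "<no src>"))
    return missing

for src in find_images_missing_alt("https://www.example.com/"):  # placeholder URL
    print(f"Missing alt text: {src}")
```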

Finally, this can’t be repeated enough: Focus on the content

This is something we’ve touched upon earlier, but it’s so important that we want to remind you once again.

It sometimes happens that people get so caught up in the metrics and technical issues of SEO that the most important element gets pushed to second place.

Quite simply, good content will always play a critical role in determining page rankings. It should be simple, it should answer a need, and it should be unique.

It’s when you have such content, and then optimize it for Google’s algorithm updates, that you’re going to see your ranking zoom to the top.

Aayush Gupta is Sr. Manager, Brand & Marketing at Uplers. He likes to stay on his toes when it comes to marketing and doing things worth risk-taking. He loves traveling and exploring local cuisines. In his free time reading books with coffee is all he wants.

Seven reasons why your rankings dropped and how to fix them
https://searchenginewatch.com/2019/05/20/seven-reasons-why-your-rankings-dropped-and-how-to-fix-them/ – 20 May 2019

Do you know that feeling of triumph when your content finally hits the first page of Google and attracts significant traffic? Unfortunately, nobody is safe from a sudden drop in rankings. The thing is that the reasons for it can vary and may not be obvious at all.

In this post, you’ll discover what could cause a sudden drop in traffic and how to fix the issue.

The tip of an iceberg

Unfortunately, there’s no one-size-fits-all solution when it comes to SEO. When you face a drop in your rankings or traffic, it’s just the tip of the iceberg. So, get ready to check lots of issues before you identify the problem.

Graph on issues that cause ranking drops

Note: Percentages assigned in the above graph are derived from personal observation.

I’ve illustrated the most common reasons for a plummet. Start by checking these parameters to find out how you can recover your rankings and drive traffic to your website.

Algorithm tests

First of all, check the SERP. What if it’s not only your website that changed its positions in search results? These sharp shifts may happen when Google tests its algorithms. In this case, you don’t even have to take any further steps, as the rankings will be restored soon.

If you track your rankings with Serpstat, you can analyze your competitors’ positions as well. It’ll help you understand whether the SERP has been changing a lot lately. From the moment you create a new project, the tool starts tracking the history of top-100 search ranking changes for the selected keywords. The “Storm” graph illustrates the effect of the changes that have occurred in the search results.

The "Storm" graph that illustrates the factors causing the ranking drop

On this chart, you can see that for the “cakes for dads” keyword the storm score was pretty high on 21st March. Now, let’s look at how the top-10 positions were changing on this date.

Graph showing a phrase-wise rise and drop in the SERP

The graph shows a sharp drop and rise that occurred in most of the positions. In a few days, all the rankings were back to normal again.

This example tells us that whenever you witness a significant drop in your search rankings, you should start with analyzing the whole SERP. If there’s a high storm score, all you need to do is to wait a bit.

In case you checked your competitors’ positions and didn’t see any movements, here’s the next step for you.

Technical issues

Technical SEO affects how search robots crawl and index your site’s content. Even if you have optimized your website technically, every time you add or remove files or pages, problems may occur. So, make sure you’re aware of technical SEO issues on your site. With Google’s URL Inspection tool, you can check the way search engines see your website.

These are the main factors crucial for your rankings:

1. Server overload

If your server isn’t prepared for traffic surges, it can take your site down at any minute. To fix this problem, you can add a CDN to your website, cache your content, set up a load balancer, or move to cloud hosting.

2. Page speed

The more images, files, and pop-ups you add to your content, the more time it takes for your pages to load. Bear in mind that page speed isn’t only a ranking factor; it also influences user experience. To quickly check the issue, you can go with Google’s PageSpeed Insights. And to speed up your website, you can:

  • Minimize HTTP requests or minify and combine files
  • Use asynchronous loading for CSS and JavaScript files
  • Defer JavaScript loading
  • Minimize time to first byte
  • Reduce server response time
  • Enable browser caching
  • Reduce image sizes
  • Use CDN again
  • Optimize CSS delivery
  • Prioritize above-the-fold content (lazy loading)
  • Reduce the number of plugins you use on your site
  • Reduce redirects and external scripts
  • Monitor mobile page speed

3. Redirections

It’s the most common cause of lost rankings. When you migrate to a new server or change the structure of your site, never forget to set up 301 redirects. Otherwise, search engines will either fail to index your new pages or even penalize your site for duplicate content.

Detecting site errors can be quite difficult, especially if an error is located on just one page. Inspecting every page would be time-consuming, and it would also be very costly if you’re running a business. To speed up the process of identifying such errors, you can use SEO and site audit tools such as Serpstat and OnCrawl.
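For a quick spot check before (or alongside) a full crawl with an audit tool, a short script can confirm that important URLs still resolve and that redirects behave as intended. The sketch below is illustrative; the URLs are placeholders.

```python
import requests

def check_redirects(urls: list[str]) -> None:
    """Print the final destination and status code chain for each URL."""
    for url in urls:
        resp = requests.get(url, allow_redirects=True)
        chain = [r.status_code for r in resp.history] + [resp.status_code]
        print(f"{url} -> {resp.url} (status chain: {chain})")
        # A 302 where a permanent 301 was intended, a long chain of hops,
        # or a 404 at the end are all worth fixing after a migration.

check_redirects([
    "https://www.example.com/old-page/",        # placeholder URLs
    "https://www.example.com/moved-category/",
])
```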

 

Wrong keywords

Are you using the right keywords? If you hadn’t considered user intent when collecting the keywords, it might have caused some problems. Even if your site was ranking high for these queries for some time, Google could have changed the way it understands your site’s intent.

I’ll provide two examples to illustrate the issue.

Case one

There’s a website of an Oxford Summer School named “oxford-royale.co.uk”. The site didn’t contain any long-form descriptions, only service pages. Once Google began to rank the website for queries with informational intent, SEO experts noticed the traffic dropped. After they added more text to the service pages, they succeeded in fixing the problem.

Case two

This case occurred with a flower delivery agency. While the website was ranking for transactional queries, everything was alright. Then Google decided the site better suited informational intent. To restore the site’s rankings, SEOs had to add keywords with high transactional intent, such as “order”, “buy”, and similar terms.

To collect the keywords that are right for your business goals, you can use KWFinder. With the tool, you can identify relevant keywords that you can easily rank for.

Screenshot of a suitable keywords' list in KWFinder

Outdated content

This paragraph doesn’t require long introductions. If your content isn’t fresh and up-to-date anymore, people won’t stay long on your site. Moreover, outdated content doesn’t attract shares and links. All these aspects may become good reasons for search engines to reduce your positions.

There’s an easy way to fix it. Update your content regularly and promote it so you don’t lose traffic. Trends keep changing, and if you’ve provided a comprehensive guide on a specific topic, you don’t want it to become outdated. Instead of creating a new guide every time, update the old one with new data.

Lost links

Everybody knows your link profile is a crucial part of your site’s SEO. Website owners put effort into building quality links to new pieces of content. However, once you’ve managed to earn a large number of backlinks, you shouldn’t stop monitoring your link profile.

To discover whether your link profile has undergone any changes over the last few weeks, use Moz or Majestic. These tools will provide you with data on your lost and newly discovered links for the selected period.

Screenshot of discovered and lost linking domains in Moz

If you find out you’ve lost links from trustworthy sources, try to identify the reasons why these links were removed. If they’re broken, you can always fix them. If website owners removed your links by chance (for example, when updating their websites), then ask them to restore the links. If they did it intentionally, no one can stop you from building new ones.

Poor user experience

User experience is one more thing crucial to your site’s rankings. If Google had started ranking your page high in search results and then noticed it didn’t meet users’ expectations, your rankings could have suffered a lot.

Search engines usually rely on metrics such as the click-through rate, time spent on your page, bounce rate, the number of visits, and more. That’s why you should remember the following rules when optimizing your site:

1. Provide relevant metadata

As metadata is used to form snippets, it should contain relevant descriptions of your content. First of all, if your snippets aren’t engaging enough, users won’t click through to your site. On the other hand, if your snippets make false promises, the bounce rate will increase.
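The most basic check here is simply whether a title and meta description exist and are roughly within commonly cited length ranges. The sketch below is illustrative; the length limits are rules of thumb rather than official Google figures, and the URL is a placeholder.

```python
import requests
from bs4 import BeautifulSoup

def audit_metadata(url: str) -> None:
    """Print the page title and meta description, with rough length warnings."""
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = (desc_tag.get("content") or "").strip() if desc_tag else ""

    print(f"Title ({len(title)} chars): {title or 'MISSING'}")
    print(f"Description ({len(description)} chars): {description or 'MISSING'}")
    if len(title) > 60 or len(description) > 160:  # rule-of-thumb limits
        print("Warning: likely to be truncated in the snippet.")

audit_metadata("https://www.example.com/")  # placeholder URL
```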

2. Create an effective content structure

It should be easy for users to extract the necessary information. Most of your visitors pay attention to your content structure when deciding whether they’ll read the post.

Break the texts into paragraphs and denote the main ideas in the subheadings. This step will help you engage visitors looking for the answer to their very specific questions.

3. Avoid complicated design and pop-ups

The content isn’t the only thing your audience looks at. People may also decide to leave your website because of irritating colors, fonts, or pop-up ads. Provide simple design and minimize the number of annoying windows.

Competition from other websites

What if none of the steps worked? It might mean that your rankings dropped because your competitors were performing better. Monitor changes in their positions and identify the SERP leaders.

You can analyze your competitors’ strategies with Serpstat or Moz. With these tools, you can discover their backlink sources, keywords they rank for, top content, and more. This step will help you come up with ideas of how you could improve your own strategy.

Never stop tracking

You can’t predict whether your rankings will drop one day. It’s much better to notice the problem before you’ve already lost traffic and conversions. So, always keep tracking your positions and be ready to react to any changes quickly.

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.

What were Google’s biggest search algorithm updates of 2018?
https://searchenginewatch.com/2019/01/08/google-search-algorithm-updates-of-2018-infographic/ – 08 Jan 2019

Search came a long way this past year. We saw the appearance of the zero-result SERP, featuring knowledge cards for answers such as conversions and times.

We welcomed the mobile-first index and the mobile speed update. With the focus on mobile, we saw meta description lengths shorten from 300+ to 150 or so.

We saw minor changes to image search and a renewed emphasis on “compelling and shareable content.” After testing video carousels on desktop SERPs for a while, Google decided to roll the feature out by replacing video thumbnails with video carousels across the board. Understandably, we’ve since seen more focus on producing video.

Some algorithm updates occurred overnight, some happened incrementally. Some caused only ripples, and some turned the SERPs upside down.

As we say hello to 2019, we want to take a moment to reflect on this past year. The algorithm changes we saw last year can be indicators of changes or trends to come. Search engines often make incremental adjustments to their filters.

So, our friends over at E2M have created a visual and entertaining overview of what went down in Google Search over 2018 – one that might help give us an idea of where we’re going next.

Google’s biggest Search algorithm updates of 2018 – A visual representation by E2M

Google’s Biggest Search Algorithm Updates of 2018

An updated look at Alpha-Beta in a world of close variants
https://searchenginewatch.com/2018/11/19/alpha-beta-close-variant-update/ – 19 Nov 2018

Google has recently updated exact match close variants; now implied words, paraphrased words, and same-intent keywords are allowed to match to exact match keywords as close variants.

This change gives advertisers less control over their exact match keywords and gives Google more control to appropriately match search query intent to similar keywords.

For advertisers using (or planning to use) the Alpha-Beta structure for their accounts, there are a few considerations to be aware of with this update to exact match close variants.

Background on Alpha-Beta

The Alpha-Beta structure breaks keywords into two different campaigns – Alpha and Beta. Alpha campaigns contain exact match keywords built from strong-performing search queries. Top-performing search queries can be determined by a few different metrics (conversions, impressions, clicks, etc.) depending on your business targets and goals. These Alpha keywords are placed into single keyword ad groups, which allows for hyper-targeted landing pages and ad copy. Single keyword ad groups allow the landing page and ad copy to be as relevant to a user’s search as possible, increasing the likelihood that a user will click on the ad. CPCs are typically lower with this structure as well, because of the relevancy of the ad copy and landing page to a user’s search intent.

Beta campaigns contain broad match modifier (BMM) keywords; these give advertisers more control than broad match keywords and are more inclusive than phrase match keywords. These Beta campaigns allow advertisers to mine for additional top-performing keywords. In order to continually identify top-performing search queries to promote to Alpha, you need to monitor Beta search queries on a recurring basis. It is also important to mine search query reports to eliminate poor-performing search queries that are not driving conversions or are irrelevant to your business.

Once the Alpha-Beta campaign structure is established, ensure that exact match traffic is funneled to your Alpha campaigns, not Beta campaigns, by adding all Alpha keywords as exact match negatives to your Beta campaigns.
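As a small illustration of that funneling step, the sketch below builds the Beta campaign’s negative list from the Alpha keyword list. The keywords are placeholders; the square brackets follow standard Google Ads exact match notation.

```python
# Every keyword promoted to an Alpha campaign is added to the Beta campaign
# as an exact match negative, so the exact query is funneled to Alpha.
alpha_keywords = [
    "b2b international payment",   # placeholder keywords
    "international payment platform",
]

# Square brackets denote exact match in Google Ads keyword notation.
beta_negative_keywords = [f"[{kw}]" for kw in alpha_keywords]

for negative in beta_negative_keywords:
    print(negative)
```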

This campaign structure allows advertisers to maintain control of: the search queries that appear on the SERP, the message delivered to consumers, treatment of top performers, and easy negation of underperforming or irrelevant queries.

Close variant changes

Google has made a few changes to exact match close variant targeting over the past few years. You might remember the change in March 2017 that allowed exact match keywords to show for typos, plurals, and other close variants as long as the meaning was similar. (Many advertisers experienced little impact from this change.) The most recent change, which occurred in late September, allowed implied words, paraphrased words, and same-intent keywords to match to our exact match keywords as close variants, with the help of Google’s algorithm. Google’s stance was that the changes were released to be more inclusive of the constantly changing consumer search behavior; they’ve said that roughly 15% of searches seen every day are new.

Impact on Alpha-Beta structure

This change in close variant matching has impacted the way the Alpha-Beta structure is managed. There are two impacts that are important to consider: increases in spend on exact match keywords and poor exact match close variant matching.

As Google starts to consider additional variants for exact match, I would expect to see traffic start to increase to Alpha campaigns. With no corresponding budget increases, Alpha campaigns could hit some restrictions, so be sure to keep an eye on campaign budgets in the upcoming months.

It is now increasingly important to monitor close variant matching. The easiest way to monitor matching is to download a search query report and filter match type for exact match (close variant). You can also look into exact match keywords that are seeing significant increases in spend or traffic. If you experience that exact match variants are poorly matching, the solution is to add ad group negatives to control the funneling of these search queries. Note: Google’s algorithm might match search queries as exact match close variants even though that search query is built out as an exact match keyword. This is another circumstance where we would want to add ad group negatives to filter traffic to the most appropriate keywords.
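Here is a minimal sketch of that monitoring step using pandas. The file name, column names, and the match type label are assumptions based on a typical search query report export, so adjust them to whatever your download actually contains.

```python
import pandas as pd

report = pd.read_csv("search_query_report.csv")  # placeholder export

# Keep only rows matched as exact match close variants.
close_variants = report[
    report["Match type"].str.contains("exact match (close variant)",
                                      case=False, regex=False)
]

# Surface the variants driving the most cost, so poor matches can be
# reviewed and negated at the ad group level if needed.
top_by_cost = close_variants.sort_values("Cost", ascending=False)
print(top_by_cost[["Search term", "Keyword", "Cost", "Conversions"]].head(20))
```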

There could also be instances where Google is matching top performing search queries to your exact match keywords as close variants. If the intent of these search queries is significantly different from the keyword that it is matching to, breaking out those search queries as keywords will allow for more control of traffic and messaging. For example, you would not want to break out sweater and sweaters into different single keyword ad groups because the intent is the same, so you can deliver the same ad copy messaging and the same landing page. You would want to break out and add an ad group negative for “b2b internal payment,” which was an exact match variant for “b2b international payment” (hmmm). These keywords have a very different intent and should never be grouped together.

In the new world of close variants, the Alpha-Beta structure still allows advertisers to maintain more control over search queries, landing pages, and messaging. But the latest update does impact some of the control over search queries matching to exact match keywords, making it more important to review search query reports regularly.

Google’s core algorithm update: Who benefited, who lost out, and what can we learn?
https://searchenginewatch.com/2018/03/27/googles-core-algorithm-update-who-benefited-who-lost-out-and-what-can-we-learn/ – 27 Mar 2018

There’s been much talk recently about Google implementing a broad core algorithm update.

A couple of weeks ago, webmasters started to notice changes to their search rankings which many suspected were due to an update to Google’s core algorithm. Google subsequently confirmed this via a tweet to its Search Liaison account, manned by former Search Engine Land editor and Search Engine Watch founder Danny Sullivan.

Google has suggested that this update has nothing to do with the quality of content, and instead focuses on improving the quality of the SERPs. At SMX West, Nathan Johns, a search quality analyst at Google, stated in an AMA session that the core update was designed to “reward under-rewarded sites” rather than award penalties.

At Pi Datametrics, our data on organic search rankings would tend to confirm this, as the only real losses we’ve seen – while dramatic – were generally short-lived, and occurred in the run-up to the update itself.

However, if Google wasn’t testing quality, what exactly were they testing?

I turned to the SERPs to have a look, going back in time to the period just before, during and after the recent update. I asked Google a relatively simple question, then analyzed the results to detect any rumblings or suspicious flux.

Testing the Google broad core algorithm update

Google Query: What’s the best toothpaste?

Google’s Broad Core Algorithm Update - Pi Datametrics test 1

 

Google’s Broad Core Algorithm Update - Pi Datametrics test key 1

I’ve focused primarily on content that was visible on page 1 or 2 at the start of this year.

We can clearly see that all these pages dropped out of the top 100 then reappeared on the same day. This occurred multiple times over a five week period.

Seven websites all performed pretty well (visible on pages 1 and 2), with a further two sites that had no previous visibility appearing midway through the shake-up (Expertreviews [dark pink] and Clevelandclinic [dark blue]).

The obvious shake-up started on 24 January, roughly five weeks before the algorithm was said to have fully rolled out (Sunday 4th March).

What we have here is a pattern we’ve seen many times before, something that is only visible with access to daily data on the entire SERPs landscape. It looks like a period of testing pre-full rollout, which is only to be expected.

Here’s the same chart, zoomed in from 01 February:

Google’s Broad Core Algorithm Update - Pi Datametrics test 2

In the chart above we can see the flux continuing from February 5 onwards. Every site involved experienced almost the exact same pattern of visibility loss.

Things finally settled down on March 8. At first glance, it looks like all sites regained their original positions.

However, on closer inspection we can see that all came out slightly worse off, by an average of just over two positions; the smallest drop being one position (which can be painful on page one) and the largest being six.

Knowing when to act and when to sit tight

If this chart says one thing, it is DON’T PANIC if you drop out of the top 100 for a term you care about!

Just keep monitoring the SERPs every day. If you’ve ruled out content cannibalization, it could well be a period of algorithm testing – as with the broad core update.

If you’ve put the searcher first and created the kind of rich content that will satisfy them, then the chances are you will recover from these testing times.

Or maybe, like the Expertreviews site above (following the injection of a long-form, socially popular and recently updated piece of content into their ecosystem), you could even move from nowhere to position three, nudging all others down a peg.

Content that matched user intent was safe

The only two websites entirely unaffected by all of this were Reviews.com and Which.co.uk, proving that the combination of first mover advantage, relevance and fantastic authority ensures high visibility and algorithmic stability:

Google’s Broad Core Algorithm Update - Pi Datametrics test 3


So, the immediate questions are – who has benefited from this shake-up? What happened in the gaps between the spikes? Who’s lost out and why? Are we now seeing a SERP more aligned with the intent of the searcher?

Who benefited from the early shake-up?

It wasn’t Expertreviews or Clevelandclinic. They benefited later.

Let’s introduce some of the momentary winners who gained visibility during the downtime of all other sites:

Google’s Broad Core Algorithm Update - Pi Datametrics test 4

Google’s Broad Core Algorithm Update - Pi Datametrics test key 4

Wins for Business Insider, Colgate and Amazon

  • Businessinsider.com benefited from the initial shake-up. It has some great content, but it’s not been updated since October 2017. It has been indexed all this time, but only really became visible when Google pushed the previously well positioned sites out. Result? It survived the shake-up and ended on page one.
  • The same happened to the Colgate page. Note its /en-us/ URL path. Arguably, it shouldn’t be visible in the UK anyway. This page only provided a list of toothpaste types, e.g. ‘Fluoride’ or ‘Tartar control’. This didn’t answer my question or match my intent. Result? It ended up dropping back to page five after the shake-up.
  • The Amazon page simply displays a list of its bestsellers in toothpaste. From a content perspective, it’s not that inspiring. Result? Ended up dropping back to page three.

So the question is – if I were searching for “What’s the best toothpaste?” which of these new pages would I prefer?

All pages are mobile friendly, but if I really wanted to know what the best toothpaste was, I’d definitely prefer to read the Businessinsider.com page – coincidentally the only page that moved up to page one following the shake-up and stayed there.

In other words, the only page to satisfy my intent was in fact the only page that remained visible post shake-up. This page, to me, answers my question perfectly.

What do these insights tell us about the core update?

Based on our testing, we can deduce that this algorithm is concerned with optimizing search results to support user intent, rather than to audit quality.

Why?

  1. Losses were not drastic, meaning we can rule out a penalty of any kind.
  2. Of all winners, none appeared to rise as a result of content updates.
  3. Some sites with strong, relevant content seemingly lost rankings in Google UK, as they were intended for the US market. This suggests that Google was auditing relevancy factors beyond just content (i.e. location / tld), to serve the best results and satisfy user intent.

In this respect, Google’s core update was concerned with the nature rather than the quality of content.

What better way to test the match of nature with intent than by shaking up the SERPs for a couple of weeks to determine user reaction?

Should you panic when your content visibility nosedives?

If your content visibility drops, it’s always necessary to carry out checks to ensure you have done everything within your power to mitigate the issue.

In the face of an algorithm update (like the recent broad core update), however, the best advice is to do nothing but monitor the SERPs closely.

If it is algorithmic testing, you most certainly won’t be the only one involved. Other sites will follow the exact same pattern down to the day. That’s a big clue that it’s algorithmic rather than isolated. Talking to others within the SEO and webmaster communities can help you to affirm that yours isn’t an isolated incident, and that you aren’t on the receiving end of a penalty from Google.
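If you track daily rankings for several sites or pages on the same term, a small script can help confirm that a drop is part of a wider, same-day pattern rather than an isolated incident. The sketch below is illustrative; the CSV layout (date, site, rank) and both thresholds are assumptions.

```python
import pandas as pd

ranks = pd.read_csv("daily_ranks.csv", parse_dates=["date"])  # placeholder export

# Day-over-day absolute rank change per tracked site.
ranks = ranks.sort_values(["site", "date"])
ranks["change"] = ranks.groupby("site")["rank"].diff().abs()

# Days on which a site moved sharply (illustratively, 10+ positions).
sharp_moves = ranks[ranks["change"] >= 10]
movers_per_day = sharp_moves.groupby("date")["site"].nunique()

# Flag days when most tracked sites moved together - a hint that the shift
# is algorithmic rather than something isolated to your own site.
sites_tracked = ranks["site"].nunique()
suspect_days = movers_per_day[movers_per_day >= 0.7 * sites_tracked]
print(suspect_days)
```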

Google has confirmed that sites that experienced ranking drops as a result of the broad core update aren’t necessarily doing anything wrong. As I stated at the beginning of this article, the losses that we did observe were short-lived and not drastic.

If you want to make sure that your content is insulated against future updates of this kind, focus on creating content that puts the searcher first and will satisfy user intent. But above all: don’t panic.

A version of this article was originally published to the Pi Datametrics blog.

Penguin 4.0: what does it mean for SEO practitioners?
https://searchenginewatch.com/2016/09/28/penguin-4-0-what-does-it-mean-for-seo-practitioners/ – 28 Sep 2016

As you’re no doubt aware, Google finally rolled out its Penguin 4.0 algorithm update at the end of last week.

Penguin is now part of Google’s core algorithm, penalising websites that use various black-hat link schemes to manipulate search rankings.

Other important changes include:

  • Penguin data is refreshed in real time, so any changes will be made as soon as the affected page has been recrawled and reindexed.
  • Penguin now devalues spam by adjusting ranking of the offending page, rather than affecting the whole site.

So how do these changes affect actual SEO practitioners? I asked a panel of experts and SEW contributors for their views on Penguin 4.0.

Do you think the new version of Penguin is fairer? Do you think it’s an adequate deterrent when it comes to spammy link-building?

Kevin Gibbons: Yes, being real-time helps to set expectations, as you won’t have to wait weeks or months for the next algorithm refresh to assess your link removals.

Of course Google’s algorithm is always a moving target – but it is becoming harder to manipulate at scale. In some verticals it can even be a game where whoever has the least-bad backlinks wins. Perhaps having a new domain with no link reputation isn’t a bad starting point any more!

Gerald Murphy: I think the algorithm is fairer. Think about it: you will be rewarded for great content, instantly. I also think that, with the rise of AI, Google will soon be able to understand links better. A flower shop on Valentine’s Day, for example, will get away with more spam-like links, but this won’t be the case in September, as links will be tied to behaviours.

Nikolay Stoyanov: Yes, I think that the real-time version of the Penguin algorithm will be fairer and will play a very positive role for the whole SEO community.

Google Penguin is now a part of the core algorithm and every change (either a negative or positive one) will happen very quickly (maybe not instantly, but on a daily or weekly basis). After more than 700 days of waiting, we can finally rest assured that whatever SEO mistakes we make, we will be able to fix them quickly afterwards.


This works both ways though. If we use some gray or black hat techniques Google will be able to catch us very quickly and punish us for not following its rules. So this is a double-edged sword.

Another great change with Penguin 4.0 is that it has become more “granular”. This means that whatever penalties hit our sites from now on will impact individual pages rather than the whole domain.

I believe that this will be a positive thing as it will give us a better chance to fix those penalized pages and to learn from our mistakes without losing a huge amount of our organic traffic (like before).

Ideally, the latest Penguin update will benefit white hat SEO experts like me and will help us take our SEO to the next level. The same goes for end users, who will get better results that answer their search intent properly.

Conversely, black hat techniques (especially PBNs) will slowly become obsolete and will eventually stop working which is the ultimate goal.

Have you experienced any effect from the Penguin update?

Kevin Gibbons: None of our clients have seen any negative shifts in organic traffic.

However, in the past we have noticed referral traffic dropping as a knock-on effect when blogs/forums/publishers have been penalised: with less traffic of their own, they send fewer outbound link clicks.

It’s too early for the data we have so far to highlight a trend, but it’s certainly one to keep a close eye on…

Nikolay Stoyanov: No, I haven’t seen any change on my site or my clients’ sites since Penguin 4.0 was launched. I guess it’s because I’m playing by the rules but also because it’s not been entirely rolled out yet. It’s way too soon to jump to conclusions.

How can webmasters best avoid the risk of being affected by Penguin?

Kevin Gibbons: Focus on building a brand, not links.

If your activity is just for link building, it will leave an SEO footprint. No-one wants that.

Aim to tell your story via content, data-driven analysis and knowledge – and amplify to a targeted audience via multiple channels; social media, paid search, digital PR etc…

Also monitor the links you have and audit them on a frequent basis. If you’re in a competitive industry, you may have to actively disavow negative links that someone else has built to your site!
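
If a disavow does prove necessary, the file Google’s disavow links tool accepts is plain text: one full URL or “domain:” entry per line, with “#” lines treated as comments. Here is a minimal sketch that turns an audited list into that format; the domain names and URLs are made up purely for illustration.

```python
# Build a disavow.txt in the format Google's disavow links tool expects:
# one "domain:example.com" or full URL per line, "#" lines are comments.
from datetime import date

# Assumed output of a manual link audit - replace with your own findings.
spammy_domains = ["cheap-links.example", "spun-articles.example"]
spammy_urls = ["https://blog.example/widget-roundup-comment-spam"]

with open("disavow.txt", "w") as f:
    f.write(f"# Disavow file generated {date.today()} after link audit\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
    for url in spammy_urls:
        f.write(f"{url}\n")
```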


Gerald Murphy: Data analysis is now even more important to SEO. The most efficient way to analyse this update is to break down links by category, sub-category, and page level, and then compare this with data such as visits, average blended rank, and revenue.
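
As a rough illustration of the kind of breakdown Gerald describes, here is a minimal sketch assuming you have already exported your backlinks and page performance into two CSVs. The file and column names are invented for the example, not a standard export format.

```python
import pandas as pd

# Assumed exports: one row per link with the page it points to and a manual
# category/sub-category label; one row per page with performance metrics.
links = pd.read_csv("links.csv")        # columns: target_page, category, sub_category
pages = pd.read_csv("performance.csv")  # columns: target_page, visits, avg_rank, revenue

merged = links.merge(pages, on="target_page", how="left")

# Compare link volume against visits, average rank and revenue per category.
summary = (
    merged.groupby(["category", "sub_category"])
          .agg(links=("target_page", "count"),
               visits=("visits", "sum"),
               avg_rank=("avg_rank", "mean"),
               revenue=("revenue", "sum"))
          .sort_values("links", ascending=False)
)
print(summary)
```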

Nikolay Stoyanov: Forget about shortcuts in SEO! There aren’t any. The only way to stay on the safe side and secure your brand, visitors and sales is if you do white hat SEO.

Write well-researched and useful content and build quality links to it. That’s it! Nothing’s changed. Hopefully, with the real-time Penguin, that’s exactly what’s going to happen. Maybe not at once, but eventually.

With Hummingbird and RankBrain we’re already seeing lots of positive changes in the SERPs from a content perspective. Now’s the time to see the same when it comes to link building.

How, if at all, will this update change the way you work?

Kevin Gibbons: The update doesn’t change our process; the only thing it might do is reaffirm that we have been on the right track by focusing on quality. We’re just hoping it catches some of our clients’ competitors out!

Nikolay Stoyanov: I wouldn’t say that Penguin 4.0 will change my work routine in any way. But I am pretty sure that there will be a much higher demand for quality link building services in the upcoming years due to this huge change in the SEO world.

Hopefully, more and more webmasters will start playing by the rules, as the rules should be the same for everyone. That’s fair!

Gerald Murphy: It won’t.

What future algorithm changes do you wish to see? Is there anything Google is ignoring?

Kevin Gibbons: There’s always been a gap between what Google says its algorithm does and what it actually does. Over recent years Google has done a much better job of closing that gap, and most of the tactics that do still work are very short-term, which is enough to keep most brands away from them.

I would expect them to be looking at things such as:

  • Spammy link building at high velocity, which can still be rewarded by Google.
  • Ecommerce site cloning can be a pain point, where Google starts to rank the phishing/fake site organically with the client’s own content.
  • Redirected domains into sites/pages/new domains – some can be for legitimate reasons (re-brands/acquisitions) – but others are purely for short-term SEO boosts.
  • Mass content production, with many companies pumping out X articles a day/week trying to show ‘freshness’ of content but not putting enough effort/resource into its quality. Long term you’d expect Panda to act against them, but short term it can work better than expected.

Gerald Murphy: AI integration with links to get a deeper analysis of behaviour, such as seasonality and maybe even social media signals. Remember, mobile is going to kill links because of our behaviour. Name me a user, sitting in the front room on their tablet or smartphone, reading another great blog, who then creates an HTML link. It’s just not in our behaviour.

Nikolay Stoyanov: I want to see all the black hat and grey hat methods dead, starting with PBNs. I still see multiple sites ranking in the top five with low-quality or PBN links that I can smell from a mile away. It’s high time Google Penguin started penalizing these websites as they deserve.

Penguin 4.0 is finally here, Google confirms https://searchenginewatch.com/2016/09/23/penguin-4-0-is-finally-here-google-confirms/ https://searchenginewatch.com/2016/09/23/penguin-4-0-is-finally-here-google-confirms/#respond Fri, 23 Sep 2016 13:18:20 +0000 https://www.searchenginewatch.com/2016/09/23/penguin-4-0-is-finally-here-google-confirms/ After a couple of years waiting, and various algorithm fluctuations described as ‘normal turbulence’, Google has finally confirmed today that its Penguin algorithm update is rolling out in all languages.

The last update in 2014 – Penguin 3.0 – may have affected less than 1% of US/UK searches, but that still ultimately translated to 12 billion queries.

Here we’ll detail all the changes you can expect from Penguin 4.0 according to Google’s blog post. But first a little refresher…

What is Penguin?

According to Adam Stetzer in his post on the delayed Penguin update, Google first launched the Penguin update in April 2012 to catch sites spamming the search results. Specifically the ones who used link schemes to manipulate search rankings.

Penguin basically hunts down inorganic links; the ones bought or placed solely for the sake of improving search rankings.

Before Penguin, bad links were simply devalued and needed to be replaced in order to recover search rankings.

But according to Chuck Price, after Penguin, bad links became ‘toxic’, requiring a link audit and the removal or disavowal of spammy links; a Penguin refresh was then usually needed before one could see any signs of recovery. This could take a while.

Thankfully this is one of the things addressed in today’s update…

What to expect from Penguin 4.0

The following improvements were among webmasters’ top requests to Google:

Penguin is now real-time

As we stated earlier, the list of sites affected by Penguin was previously only refreshed periodically, all at the same time.

According to Google…

“Once a webmaster considerably improved their site and its presence on the internet, many of Google’s algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed.”

But now, Penguin data is refreshed in real time, so any changes will be made as soon as the affected page has been recrawled and reindexed.

Google also states that it is not going to comment on future refreshes.

Penguin is now more granular

Penguin now devalues spam by adjusting rankings based on spam signals, rather than affecting the ranking of the whole site.

So any penalties will be delivered to a specific page rather than an entire domain, which seems much fairer in the long run.

We’ll bring you more information and follow-up stories on the update as soon as we have more insight.

So what the heck is RankBrain? https://searchenginewatch.com/2015/10/29/so-what-the-heck-is-rankbrain/ https://searchenginewatch.com/2015/10/29/so-what-the-heck-is-rankbrain/#respond Thu, 29 Oct 2015 11:53:00 +0000 https://www.searchenginewatch.com/2015/10/29/so-what-the-heck-is-rankbrain/ You won’t have failed to notice in the news this week certain headlines along the lines of “Google reveals new machine learning algorithm” or “Google turning its search over to artificial intelligence” or “The machines have taken over, run screaming for the hills.”

That last one is taken from my own street-pamphlet The Sensationalist Times.

The buzz (both warranted and unwarranted) comes from an interview published by Bloomberg a few days ago, in which Greg Corrado, a senior research scientist at Google, stated that a “very large fraction” of the millions of queries a second that people type into Google have been interpreted by an artificial intelligence system nicknamed RankBrain.

It is also stated that RankBrain has now become the third-most important signal contributing to the result of a search query. This is pretty significant and you can understand the ensuing interest from anyone with even a passing interest in search.

As Corrado himself humbly states…

“I was surprised… I would describe this as having gone better than we would have expected.”

So let’s answer a few questions based on the available evidence out there at the moment…

What is RankBrain?

It’s an artificial intelligence system developed by Google that helps process its search results using machine learning.

Machine learning is basically a form of AI, where computer programs are created that can teach themselves how to develop and change when exposed to different types of data (in this case, search queries).

How does it work?

Here’s where we refer back to Google’s senior research scientist…

RankBrain embeds vast amounts of written language as mathematical entities (vectors) in a format that a computer can understand. Then, if RankBrain sees a word or phrase it isn’t familiar with, it can use this information to ‘guess’ what words or phrases might have a similar meaning and filter the results accordingly.
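
Google hasn’t published how RankBrain does this internally, but the general idea of mapping words to vectors and ‘guessing’ by similarity can be sketched in a few lines. The toy vectors below are invented purely for illustration; real systems learn them from enormous amounts of text.

```python
import math

# Toy word vectors - real systems learn these from huge text corpora.
vectors = {
    "code":    [0.9, 0.1, 0.0],
    "program": [0.8, 0.2, 0.1],
    "recipe":  [0.1, 0.9, 0.3],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# A query term is mapped to its vector and matched to the closest known word,
# which is how "learn to program" pages can satisfy a "learn to code" query.
query_vector = vectors["code"]
closest = max(("program", "recipe"), key=lambda w: cosine(query_vector, vectors[w]))
print(closest)  # -> "program"
```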

What purpose does it serve?

Basically, it’s used for handling ambiguous or unique questions that have never been submitted to Google before.

According to Google, brand new queries make up around 15% of all searches each day, and as Search Engine Land pointed out, Google processes 3bn searches per day, which means that around 450m per day are entirely unique in nature.

Clearly there is a need to use machine learning to cope with the sheer demand of its users. RankBrain is also showing signs of outperforming Google’s own search engineers. The engineers behind the search software were asked to look at various pages and predict which would be ranked at the top of the Google results. The humans guessed correctly 70% of the time; RankBrain guessed correctly 80% of the time.

Should I be terrified?

Yes absolutely. As I said in my street-pamphlet that you probably threw in the trash immediately after I forced it into your hand, “run for the hills.”

Really?

No. I’m sure everything will be fine.

Gaming Google in 2015 https://searchenginewatch.com/2015/09/21/gaming-google-in-2015/ https://searchenginewatch.com/2015/09/21/gaming-google-in-2015/#respond Mon, 21 Sep 2015 12:30:00 +0000 https://www.searchenginewatch.com/2015/09/21/gaming-google-in-2015/ I recently read a blog post in which the author claimed, “The days of gaming Google are over.” What an absurd statement. If you are an SEO, gaming Google is what you do. As long as Google uses algorithms to serve up search results, there will always be a way to exploit it. The polite term that we use to describe such exploitation is search engine optimization.

The good news is, there will never be an easier time to use SEO to game Google than right now. Over time, it will become more difficult to reverse engineer ranking factors as the algorithm improves. The signals will become much larger in number and more subtle.

In a recent interview, Gary Illyes pitched the Google company line, “I see many, many websites that are not doing much SEO on their websites and they are doing remarkably well. If they can do it, then pretty much anyone can do it.”


The reality is that it’s nearly impossible to do remarkably well in a competitive niche without engaging actively in SEO. This is a variation on the Matt Cutts credo, “Write for users, not search engines.” The fact of the matter is, you need to be writing for both.

Back to That Subtlety Thing

The Hummingbird algorithm ushered in the first wave of algorithm subtlety. Instead of relying strictly on keywords, this update factored in user intent to determine the full context of what a page is about. This changed how pages should be written and optimized in a rather profound way.

Keyword Density Alone: No Longer a Reliable Metric

There was a time when it was enough to just check out a competitor’s keyword density and replicate it when optimizing a page. Actually, if a targeted keyword phrase did not appear on the page at least once, it was highly unlikely – if not impossible – for it to appear in the Google SERPs. That has all changed with Google integrating Latent Semantic Indexing into the algorithm.

How Did Things Change?

Here’s an example. If you conduct a search on Google for “learn to code,” the following nine organic results appear on the first page (news results excluded):

[Image: table of the nine first-page organic results for “learn to code”]

Some Interesting Things to Note:

  • The exact phrase “learn to code” does not appear in three of nine title tags.
  • The phrase “learn to code” appears nowhere in the text of the two bold results.

I find the search result pine.fm/LearnToProgram to be especially interesting, as the keyword phrase “learn to code” is not used in the title tag, the description, or even on the page.

What this means is that Google has determined that “learn to program” is also a good match to the user query “learn to code.” In other words, just including a keyword in a title tag and X times on a page isn’t enough to reach page one. The page needs to be useful, and it needs to semantically earn the right to be there.

That’s what Gary Illyes was getting at when he said, “And what I see is that, in many cases, SEOs ‘over-SEO’ a website. They are trying to rank for keywords or terms that the site doesn’t have great content for.”

I would argue that is really under-SEOing a website. Creating killer content that ranks well is a core component of SEO in 2015. I am not suggesting that keyword density be ignored entirely as a metric. Using the median word count and keyword density percentage is a good baseline for testing. For this reason, it’s my opinion that it is still worthwhile.
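
If you do keep density and word count as a rough baseline, the check is simple enough to script. A minimal sketch, assuming you have already pulled the visible text of the competing pages (the placeholder strings below stand in for that text):

```python
import re
from statistics import median

# Assumed input: the visible text of each top-ranking competitor page
# (how you fetch it is up to you - these strings are placeholders).
competitor_texts = ["...page one text...", "...page two text...", "...page three text..."]
phrase = "learn to code"

def keyword_density(text, phrase):
    """Percentage of the page's words accounted for by the target phrase."""
    words = re.findall(r"\w+", text.lower())
    occurrences = text.lower().count(phrase)
    return 100 * occurrences * len(phrase.split()) / max(len(words), 1)

word_counts = [len(re.findall(r"\w+", t)) for t in competitor_texts]
densities = [keyword_density(t, phrase) for t in competitor_texts]

print("median word count:", median(word_counts))
print("median keyword density (%):", round(median(densities), 2))
```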

How to Make a Page Semantically Relevant

First and foremost, the page really does need to be useful and deserving of page one status. To determine what this means, review the top ranking pages and the related phrases on these pages. Look for patterns. What phrases are common among the top search results? What related phrases are appearing in search results? In the case of “learn to code,” this would include “tutorial,” “programmer,” and “programming,” among others.
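
You can also look for those shared phrases programmatically rather than by eye, for example by counting which two-word phrases recur across the top-ranking pages. Again, a minimal sketch assuming you already have the page text to hand:

```python
import re
from collections import Counter

# Assumed input: visible text of each top-ranking page for the target query
# (the strings are placeholders; pair this with your own fetching/scraping).
top_pages = ["...text of result one...", "...text of result two...", "...text of result three..."]

def bigrams(text):
    words = re.findall(r"[a-z]+", text.lower())
    return set(zip(words, words[1:]))  # one vote per page, however often a phrase repeats

# Phrases that appear on most of the top pages are the patterns to look for.
counts = Counter(bg for page in top_pages for bg in bigrams(page))
for (first, second), page_count in counts.most_common(10):
    print(f'"{first} {second}" appears on {page_count} of {len(top_pages)} pages')
```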

Don’t forget to look at Google’s suggestions at the bottom of your search results:

[Image: Google’s related-search suggestions for “learn to code”]

I suspect the same algorithm powering these suggestions is also being used to serve up search results.

The Takeaway

Gaming Google will never be any simpler than it is today. However, what is involved is much different from what it was in previous years. High-volume exploits – in other words, lots of links and pages – have been replaced by high-quality content and links that earn your position.
