Optimize Google's new Interaction to Next Paint metric
https://searchenginewatch.com/2023/05/19/optimize-googles-new-interaction-to-next-paint-metric/
Fri, 19 May 2023

30-second summary:

  • Good page speed and user experience help your site stand out in search results
  • The Interaction to Next Paint metric is replacing First Input Delay
  • You can make your site respond faster to user input by reducing CPU processing times

The Core Web Vitals are a set of metrics that Google has defined to measure how good a website’s user experience is. They first became a ranking signal in 2021.

While the metric definitions have been tweaked over time, the introduction of the Interaction to Next Paint metric is the biggest change since the launch of the Core Web Vitals initiative.

What is Interaction to Next Paint (INP)?

Interaction to Next Paint is a metric that evaluates how quickly your website responds to user interaction. It measures how much time elapses between the user input, like a button click, and the next time the page content refreshes (the “next paint”).

To rank better in Google, this interaction delay should be less than 200 milliseconds. This ensures that the website feels responsive to users.
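Google's published thresholds for INP can be encoded in a small helper for bucketing field data (the 200ms and 500ms cut-offs come from Google's Core Web Vitals guidance; the function name is our own):

```javascript
// Bucket an INP value (in milliseconds) using Google's published
// Core Web Vitals thresholds: good <= 200ms, poor > 500ms.
function classifyINP(inpMs) {
  if (inpMs <= 200) return 'good';
  if (inpMs <= 500) return 'needs-improvement';
  return 'poor';
}
```

In the field, the open-source web-vitals library exposes an onINP callback that reports this value per page view.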

How are the Core Web Vitals changing?

Google has announced that Interaction to Next Paint will become one of the three Core Web Vitals metrics in March 2024. At that point, a website that responds to user input too slowly could rank worse in search results.

INP will replace the current First Input Delay (FID) metric. While FID also measures responsiveness, it is more limited as it only looks at the first user interaction. It also only measures the delay until the input event starts being handled, rather than waiting until the user can see the result.

Currently only 64.9% of mobile websites do well on the Interaction to Next Paint metric and it will be harder to get a good INP score than a good First Input Delay score.

How can I measure the Interaction to Next Paint metric on my website?

Run a website speed test to see how fast your website loads and how quickly it responds to user input.

Open the “Web Vitals” tab once your test is complete. You can see the Interaction to Next Paint metric at the bottom of the page.

In this case only 38% of users have a good INP experience.

How can I optimize Interaction to Next Paint?

Interaction delays happen when the browser needs to perform a lot of CPU processing before it can update the page. This can happen for two reasons:

  • Ongoing background tasks prevent the user input from being handled
  • Handling the user input itself is taking a lot of time

Background tasks often happen during the initial page load, but can happen later on as well. They are often caused by third party code embedded on the website.

Responding to a user interaction can require a lot of processing. If that can’t be optimized you can consider showing a spinner to provide visual feedback until the processing task is complete.

Running JavaScript code is the most common type of processing, but complex visual updates can also take a long time.
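A common mitigation is to break long tasks into smaller chunks and yield back to the main thread between them, so pending user input can be handled sooner. A minimal sketch, using setTimeout(0) as a widely supported fallback for the newer scheduler.yield() API:

```javascript
// Yield control back to the event loop so pending user input
// can be handled between chunks of work.
function yieldToMain() {
  return new Promise(resolve => setTimeout(resolve, 0));
}

// Process a large list in small chunks instead of one long task.
async function processInChunks(items, processItem, chunkSize = 100) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    await yieldToMain(); // the browser can paint and handle input here
  }
  return results;
}
```

The work takes slightly longer overall, but the page stays responsive throughout, which is exactly what INP rewards.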

Use Chrome DevTools to analyze performance

The Chrome DevTools performance profiler shows what tasks are taking a long time and should be optimized. Start a recording, click on an element on the page, and then click on the longest bars in the visualization.

This allows you to identify whether the code comes from a third party or from your own website. You can also dive deeper to see how the task can be sped up.

Check the Total Blocking Time metric to identify background tasks

The Total Blocking Time metric tracks how long background CPU tasks block the main thread, keeping other code from running. If the user interacts with the page while a task is already in progress, the browser first completes that task before handling the input event.
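Lighthouse computes Total Blocking Time by summing, for each main-thread task longer than 50ms, the portion that exceeds 50ms. A sketch of that calculation (in practice the task durations come from a performance trace; the sample values below are made up):

```javascript
// Sum the "blocking" portion (everything over 50ms) of each long task.
function totalBlockingTime(taskDurationsMs) {
  const THRESHOLD_MS = 50;
  return taskDurationsMs
    .filter(d => d > THRESHOLD_MS)
    .reduce((sum, d) => sum + (d - THRESHOLD_MS), 0);
}
```

For example, tasks of 30ms, 120ms, and 200ms give a TBT of 220ms: the 30ms task contributes nothing, while the others contribute 70ms and 150ms.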

You can use tools like Google Lighthouse to see how this metric can be optimized.

If processing-heavy tasks on your website are part of your core website code you’ll need to work with your development team to optimize these. For third parties you can review whether the script is still needed, or contact customer support of the vendor to see if it’s possible to optimize the code.

Monitor Interaction to Next Paint

Want to keep track of how you’re doing on INP and other Core Web Vitals? DebugBear can keep track of your website speed and help you optimize it.

Start a free 14-day trial today and deliver a better user experience.

Conclusion

The Interaction to Next Paint metric represents the biggest change to Google’s Core Web Vitals since they were originally announced. INP addresses the deficiencies of the previous First Input Delay metric and provides a better representation of how users experience a website.

Check how your website does on the Interaction to Next Paint metric before the ranking change is rolled out in 2024. That way you’ll have plenty of time to identify optimizations and make your website faster.

Try DebugBear with a free 14-day trial.

The Search Engine Watch Top 5!
https://searchenginewatch.com/2022/12/27/the-search-engine-watch-top-5/
Tue, 27 Dec 2022

First, congratulations on surviving 2022 – you've done great! 2022 was a surprising, unique, and challenging mix of global events that kept us on our toes as consumers, brands, and search marketing professionals: the recession, the Great Resignation, a war, the FIFA World Cup final, and several silent battles we all fought by ourselves.

As we recap the year gone by, let’s look at the world through the lens of search, SEO, analytics, and content creation.


2022 has been about…

  • Looking at your consumers as human beings and not just data sets
  • Understanding how your target consumers perceive the world and how they experience life in a digital age
  • Tailoring and testing your strategies to meet consumers in their moment of need – all without losing budget (or your sanity!)
  • Finding the most effective tools, technologies, and talent to navigate business uncertainty

We present to you the #SEWTop5

A countdown of editor’s picks that the Search Engine Watch community loved and found great value in!

#5. Understanding the three awareness stages of your online audience

Businesses often forget that success metrics aren't just numbers – behind them are living, breathing people driven by behavior and emotions. As customer journeys remained complex and multifaceted, businesses competed to ensure they were at the finish line when prospects were ready to convert.

Add People’s Content Operations Lead, Jack Bird created a guide on harnessing a content strategy that caters to consumers and their journeys. He detailed the three key awareness stages of online traffic, what type of content fits these stages, and how to audit your existing content.

#4. A must-have web accessibility checklist for digital marketers

Did you know that 98% of US-based websites aren't accessible? This year, web accessibility moved out of the shadows and took center stage as one of Google's search ranking factors – making the topic itself more accessible to discussion. Marketers could no longer ignore this critical aspect, because –

Stellar user experience >> Positive brand perception >> Greater appeal to value-driven consumers = Good for business

Web design and marketing specialist, Atul Jindal created a must-have web accessibility checklist for digital marketers. It went beyond dispelling “what is web accessibility?” and spoke about its benefits and action points on “how to make your website accessible?”.

#3. Google Analytics 4: drawbacks and limitations—is it worth sticking around?

On July 1, 2023, Universal Analytics properties will stop processing new hits, forcing users to switch to its successor, Google Analytics 4. This transition demands a steep learning curve and adaptability from SEOs and marketers, since the shift means losing some historical data.

This article dove into the issues with Google Analytics 4 from a user perspective and a privacy and compliance standpoint. Objective, hard-hitting observations helped inform SEOs and marketers’ decisions before switching platforms.

#2. The not-so-SEO checklist for 2022

While most of the internet focused on "what to do", we took the offbeat path of "what not to do" to help your SEO succeed from the get-go.

Best-selling author and SEW Advisory Board Member, Kristopher (Kris) Jones dispelled some major myths surrounding Core Web Vitals (CWV) and Google’s bigger, mainstream 2021 updates.

As an especially interesting, strategy-focused read, this was one SEOs could not miss before designing their 2022 strategy.

#1. Seven Google alerts SEOs need to stay on top of everything!

We as SEOs and marketers often forget that while we focus on consumers and clients, we too are humans – with limited energy (we mean coffee supply), 24 hours (wish we had more), and sleep deprivation (yes we mean sleep deprivation). As burnout crept in and to-do lists climbed, our very own Ann Smarty shared seven Google alerts that aimed at making life easier for SEOs.

These smart ways helped the community get ahead of competition, prevent a reputation crisis, fix a traffic drop, and do much more (without getting overwhelmed).

We hope you enjoyed this! Thank you for being valuable supporters throughout our journey.

Team Search Engine Watch wishes everyone a happy new year! Keep spreading the love and SEO wisdom.


*Ranked on target audience engagement, time on page, and bounce rate.


Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

The ultimate 2022 Google updates round up
https://searchenginewatch.com/2022/12/21/the-ultimate-2022-google-updates-round-up/
Wed, 21 Dec 2022


30-second summary:

  • 2022 saw nine confirmed updates (including two core updates), five unconfirmed instances where volatility was observed in page rankings, and one data outage that caused chaos for 48 hours
  • Video and commerce sites were the biggest winners in the May core update, while reference and news sites lost out most, especially outlets without industry specificity
  • This theme largely continued and saw ripple effects from the helpful content update
  • What were these ebbs and flows, who won, who lost? Let’s find out!
  • Joe Dawson takes us through another round-up post that gives you the complete picture of Google’s moves

Only three things are certain in this life – death, taxes, and an industry-wide hubbub whenever Google launches an algorithm update. Like any year, 2022 has seen substantial changes in how the world’s largest search engine manages traffic and page rankings, with some businesses winning and others losing out.

Arguably the most significant change in 2022 has been the rise of AI for content creation, which became a hot topic in the world of marketing software. The "helpful content" updates were intended to bolster content written by human beings, penned with consumer needs in mind, over auto-generated articles designed to game the SEO system.

Has this been successful, or is the world of online marketing set for a rise of the machines in 2023 and beyond? As in last year's column, let's review the Google algorithm updates issued in 2022. I hope this helps you draw your own conclusions and build your business model around the latest developments in page ranking.

Complete list of 2022 Google updates

2022 has seen nine confirmed updates to Google’s algorithms, while an additional five instances of volatility were noticed and discussed by influential content marketing strategists across the year. We also saw one major data outage that caused a short-term panic! Let’s take a look at each of these updates in turn.

1) Unconfirmed, suspected update (January)

The core update of November 2021 was famously volatile, and just as web admins were coming to terms with a new status quo, further fluctuations were noted in early January 2022. Google remained tight-lipped about whether adjustments had been made to the algorithm, but sharp changes to SERPs were acknowledged across various industries.

2) Unconfirmed, suspected update (February)

Again, webmasters noticed a sudden temperature shift in page rankings in early February, just as things settled down after the January changes. While again unconfirmed by Google, these adjustments may have been laying the groundwork for the page experience update scheduled for later in the same month.

3) Page experience update (February)

Back in 2021, Google rolled out a page experience update designed to improve the mobile browsing experience. In February 2022, the same update was extended to encompass desktop browsing.

The consequences were not earth-shattering, but a handful of sites that previously enjoyed SERPs at the top of page one found their ranking drop. As with the mobile update, the driving forces behind the page experience update were performance measured against Google’s core web vitals.

4) Unconfirmed, suspected update (March)

Fluxes in page ranking and traffic were detected in mid-March, with enough chatter around the industry that Danny Sullivan, Public Liaison for Search at Google, felt compelled to confirm that neither he nor his colleagues were aware of any deliberate updates.

5) Product reviews update (March)

March saw the first of three product review updates that would unfold throughout the year. As we’ll discuss shortly, ecommerce sites experienced a real shot in the arm throughout 2022 after the core updates, so this would prove to be a significant adjustment.

The fundamental aim of this product review update was to boost sites that offer more than just a template review of consumer goods – especially when linking to affiliates to encourage purchase. Best practice in product reviews following this update includes:

  • Detailed specifications beyond those found in a manufacturer description, including pros and cons and comparisons to previous generations of the same item.
  • Evidence of personal experience with a product to bolster the authenticity of the review, ideally in the form of a video or audio recording.
  • Multiple links to a range of merchants to enhance consumer choice, rather than the popular model of linking to Amazon.
  • Comparisons to rival products, explaining how the reviewed product stacks up against the competition – for good or ill.

The product review update did not punish or penalize sites that failed to abide by these policies – those that preferred to list a selection of items with brief (and arguably thin) copy discussing their merits. However, sites that offered more detail in their assessments quickly found themselves rising in the rankings.

6) Core update (May)

The first core update of the year is always a nerve-wracking event in the industry, and as always, there were winners and losers in May’s adjustments.

The most striking outcome of this update was just how many major names benefitted, especially in the realm of ecommerce, much to the delight of ecommerce agencies around the world. Sites like Amazon, eBay, and Etsy saw considerable increases in traffic and prominence following the update, perhaps due to the product review update that unfolded two months prior.

Video sites also saw a spike in viewers and positioning following the May update. YouTube videos began outranking text articles while streaming services such as Disney Plus and Hulu rose to the top of many searches. Health sites began to see a slow and steady recovery after the May core update, for the first time since the rollout of 2018’s Medic update.

News and reference sites were the biggest losers in the May core update. News and media outlets suffered the most, especially those with a generic focus, such as the online arm of newspapers. Big hitters like Wikipedia and Dictionary.com were also pushed down the pecking order. Specialist sites that dedicate their reporting to a single area of interest fared a little better, but still took a hit in traffic and visibility.

7) Unconfirmed, suspected update (June)

Minor nips and tucks frequently follow when a major core update concludes. In late June, many webmasters started comparing notes on sharp changes in traffic and page ranking. Google failed to confirm any updates. These may have just been delayed aftershocks in the aftermath of May’s core update, but the industries that saw the biggest adjustments were:

  • Property and real estate
  • Hobbies and leisure
  • Pets and animal care

8) Unconfirmed, suspected update (July)

More websites saw a sharp drop in traffic in late July, especially blogs that lacked a prominent social media presence. SERPs for smaller sites were among the biggest losers in this unconfirmed update.

9) Product reviews update (July)

A minor tweak to March’s product review update was announced and rolled out in July, but caused little impact – while some review sites saw traffic drop, most were untouched, especially in comparison to changes at the start of the year.

10) Data center outage (August)

Not an update, but a notable event in the 2022 SEO calendar. In early August, Google Search experienced an overnight outage, which was revealed to have been caused by a fire in a data center in Iowa in which three technicians were injured (thankfully, there were no fatalities).

This outage caused 48 hours of panic and chaos among web admins, with page rankings undergoing huge, unexpected fluctuations, a failure of newly-uploaded pages to be indexed, and evergreen content disappearing from Google Search.

Normal service was resumed within 48 hours, and these sudden changes were reversed. All the same, it led to a great deal of short-term confusion within the industry.

11) Helpful content update (August)

The first helpful content update of 2022 saw significant changes to the SEO landscape – and may change how many websites operate in the future.

As the name suggests, this update is engineered to ensure that the most helpful, consumer-focused content rises to the top of Google’s search rankings. Some of the elements targeted and penalized during this update were as follows.

AI content
An increasing number of sites have been relying on AI to create content, amalgamating and repurposing existing articles from elsewhere on the web with SEO in mind. On paper, the helpful content update pushed human-generated content above these computerized texts.

Subject focus
As with the core update in May, websites that cover a broad range of subjects were likeliest to be hit by the helpful content update. Google has been taking steps to file every indexed website under a niche industry, so it's easier for a target audience to find.

Expertise
The EAT algorithm has been the driving force behind page rankings for a while now, and the helpful content update has doubled down on this. Pages that offer first-hand experience of their chosen subject matter will typically outrank those based on external research.

User behavior
As part of the helpful content update, Google is paying increasing attention to user behavior – most notably the time spent on a site. High bounce rates will see even harsher penalties in a post-helpful content update world.

Bait-and-switch titles
If your content does not match your title or H2 headings, your site's ranking will suffer. Avoid speculation, too. Attempts to gain traffic by asking questions that cannot be answered (for example, a headline asking when a new show will drop on Netflix, followed by an answer of, "Netflix has not confirmed when <TV show name> will drop") suffered in this update.

Word stuffing
Google has long denied that word count influences page ranking and advised against elongating articles for the sake of keyword stuffing. The helpful content update has made this increasingly important. 1,000 relevant words that answer a question quickly will outrank a meandering missive of 3,000 words packed with thin content.

12) Core update (September)

The second core update of 2022 unfolded in September, hot on the heels of the helpful content update.

This update repaired some of the damage for reputable reference sites that suffered in May, while those impacted by the unconfirmed update in June continued to see fluctuations in visibility – some enjoyed sharp upticks, while others continued to hemorrhage traffic.

The biggest ecommerce brands continued to enjoy success following this update, while news and media outlets continued to plummet in visibility. Household names like CNN and the New York Post, for example, were hit very hard.

The fortunes of medical sites also continued to improve, especially those with government domains. Interestingly, the trend for promoting videos over prose was reversed in September – YouTube was the biggest loser overall.

13) Product reviews update (September)

A final tweak was made to the product reviews update in September as part of the core update, and it proved to be unpopular with many smaller sites, which saw a substantial drop in traffic and conversions. As discussed, it seems that 2022’s core updates have benefitted the biggest hitters in the market.

14) Spam update (October)

In October, Google rolled out a 48-hour spam update. This was an extension of the helpful content updates, designed to filter out irrelevant and inexpert search results, in addition to sites loaded with malware or phishing schemes.

Sites identified as potential spam during the update were severely penalized in terms of page ranking and, in some cases, removed from Google Search altogether. The most prominent targets of the update were:

  • Thin copy irrelevant to the search term, especially if auto-generated
  • Hacked websites with malicious or irrelevant redirects and sites that failed to adopt appropriate security protocols
  • Hidden links or excessive, unrelated affiliate links and pages
  • Artificial, machine-generated traffic

15) Helpful content update (December)

Early in December, Google began rolling out an update to August’s helpful content update. At the time of writing, it’s too early to announce what the impact of this has been. However, it promises to be an interesting time.

The August update faced criticism for being too sedate and failing to crack down hard enough on offending sites, especially those that utilize AI content and black-hat SEO tactics.

Many site owners will be crossing their fingers and toes that this update boosts genuine, human-generated copy created by and for a website’s target audience. The impact will become evident early in 2023.

This concludes the summary of 2022’s Google algorithm updates. It’s been an interesting – and frequently tumultuous – twelve months, and one that may set the tone for the years to come.

Google will always tweak and finesse its policies, and attempting to second-guess what Alphabet will do next is frequently a fool’s errand. All the same, it’s always helpful to check in with Google’s priorities and see which way the wind is blowing.


Joe Dawson is Director of strategic growth agency Creative.onl, based in the UK. He can be found on Twitter @jdwn.

Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

Is Google headed towards a continuous “real-time” algorithm?
https://searchenginewatch.com/2022/11/03/is-google-headed-towards-a-continuous-real-time-algorithm/
Thu, 03 Nov 2022


30-second summary:

  • The present reality is that Google presses the button and updates its algorithm, which in turn can update site rankings
  • What if we are entering a world where it is less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”?
  • Advisory Board member and Wix’s Head of SEO Branding, Mordy Oberstein shares his data observations and insights

If you’ve been doing SEO even for a short while, chances are you’re familiar with a Google algorithm update. Every so often, whether we like it or not, Google presses the button and updates its algorithm, which in turn can update our rankings. The key phrase here is “presses the button.” 

But, what if we are entering a world where it’s less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”? What would that world look like and who would it benefit? 

What do we mean by continuous real-time algorithm updates?

It is obvious that technology is constantly evolving, but what needs to be made clear is that this applies to Google's algorithm as well. As the technology available to Google improves, the search engine can do things like better understand content and assess websites. However, this technology needs to be incorporated into the algorithm. In other words, as new technology becomes available to Google, or as current technology improves (we might refer to this as machine learning "getting smarter"), Google needs to "make it a part" of its algorithms in order to utilize these advancements.

Take MUM for example. Google has started to use aspects of MUM in the algorithm. However, (at the time of writing) MUM is not fully implemented. As time goes on and based on Google’s previous announcements, MUM is almost certainly going to be applied to additional algorithmic tasks.  

Of course, once Google introduces new technology or has refined its current capabilities it will likely want to reassess rankings. If Google is better at understanding content or assessing site quality, wouldn’t it want to apply these capabilities to the rankings? When it does so, Google “presses the button” and releases an algorithm update. 

So, say one of Google’s current machine-learning properties has evolved. It’s taken the input over time and has been refined – it’s “smarter” for lack of a better word. Google may elect to “reintroduce” this refined machine learning property into the algorithm and reassess the pages being ranked accordingly.    

These updates are specific and purposeful. Google is “pushing the button.” This is most clearly seen when Google announces something like a core update or product review update or even a spam update. 

In fact, perhaps nothing better concretizes what I've been saying here than what Google said about its spam updates:

“While Google’s automated systems to detect search spam are constantly operating, we occasionally make notable improvements to how they work…. From time to time, we improve that system to make it better at spotting spam and to help ensure it catches new types of spam.” 

In other words, Google was able to develop an improvement to a current machine learning property and released an update so that this improvement could be applied to ranking pages. 

If this process is "manual" (to use a crude word), what then would continuous "real-time" updates be? Let's take Google's Product Review Updates. Initially released in April 2021, Google's Product Review Updates aim at weeding out product review pages that are thin, unhelpful, and (if we're going to call a spade a spade) exist essentially to earn affiliate revenue.

To do this, Google is using machine learning in a specific way, looking at specific criteria. With each iteration of the update (such as there was in December 2021, March 2022, etc.) these machine learning apparatuses have the opportunity to recalibrate and refine. Meaning, they can be potentially more effective over time as the machine “learns” – which is kind of the point when it comes to machine learning. 

What I theorize, at this point, is that as these machine learning properties refine themselves, rank fluctuates accordingly. Meaning, Google allows machine learning properties to “recalibrate” and impact the rankings. Google then reviews and analyzes and sees if the changes are to its liking. 

We may know this process as unconfirmed algorithm updates (for the record I am 100% not saying that all unconfirmed updates are as such). It’s why I believe there is such a strong tendency towards rank reversals in between official algorithm updates. 

It’s quite common that the SERP will see a noticeable increase in rank fluctuations that can impact a page’s rankings only to see those rankings reverse back to their original position with the next wave of rank fluctuations (whether that be a few days later or weeks later). In fact, this process can repeat itself multiple times. The net effect is a given page seeing rank changes followed by reversals or a series of reversals.  


A series of rank reversals impacting almost all pages ranking between position 5 and 20 that align with across-the-board heightened rank fluctuations 

This trend, as I see it, is Google allowing its machine learning properties to evolve or recalibrate (or however you’d like to describe it) in real-time. Meaning, no one is pushing a button over at Google but rather the algorithm is adjusting to the continuous “real-time” recalibration of the machine learning properties.

It’s this dynamic that I am referring to when I question if we are heading toward “real-time” or “continuous” algorithmic rank adjustments.

What would a continuous real-time Google algorithm mean?

So what? What if Google adopted a continuous real-time model? What would the practical implications be? 

In a nutshell, it would mean that rank volatility would be far more of a constant. Instead of waiting for Google to push the button on an algorithm update for rank to be significantly impacted, this would simply be the norm. The algorithm would constantly evaluate pages and sites "on its own" and adjust rankings closer to real time.

Another implication would be a lack of having to wait for the next update for restoration. While not a hard-and-fast rule, if you are significantly impacted by an official Google update, such as a core update, you generally won't see rank restoration occur until the release of the next version of the update – whereupon your pages will be re-evaluated. In a real-time scenario, pages are constantly being evaluated, much the way links have been since Penguin 4.0 was released in 2016. To me, this would be a major change to the current "SERP ecosystem."

I would even argue that, to an extent, we already have a continuous "real-time" algorithm. That we at least partially have one is simply fact: in 2016, Google released Penguin 4.0, which removed the need to wait for another version of the update, as this specific algorithm evaluates pages on a constant basis.

However, outside of Penguin, what do I mean when I say that, to an extent, we already have a continuous real-time algorithm? 

The case for real-time algorithm adjustments

The constant "real-time" rank adjustments that occur in the ecosystem are so significant that they have redefined the volatility landscape.

Per Semrush data I pulled, there was a 58% increase in the number of days that reflected high-rank volatility in 2021 as compared to 2020. Similarly, there was a 59% increase in the number of days that reflected either high or very high levels of rank volatility: 


Simply put, there is a significant increase in the number of instances reflecting elevated levels of rank volatility. After studying these trends and looking at the ranking patterns, I believe the aforementioned rank reversals are the cause. Meaning, a large portion of the increased instances of rank volatility comes from what I believe to be machine learning continually recalibrating in “real-time,” thereby producing unprecedented levels of rank reversals.

Supporting this is the fact that, along with the increased instances of rank volatility, we did not see increases in how drastic the rank movement was. Meaning, there are more instances of rank volatility, but the degree of volatility did not increase.

In fact, there was a decrease in how dramatic the average rank movement was in 2021 relative to 2020! 

Why? Again, I chalk this up to the recalibration of machine learning properties and their “real-time” impact on rankings. In other words, we’re starting to see more micro-movements that align with the natural evolution of Google’s machine-learning properties. 

When a machine learning property is refined as its intake/learning advances, you’re unlikely to see enormous swings in the rankings. Rather, you will see a refinement in the rankings that aligns with the refinement in the machine learning itself.

Hence, the rank movement we’re seeing, as a rule, is far more constant yet not as drastic. 

The final step towards continuous real-time algorithm updates

While much of the ranking movement that occurs is continuous in that it is not dependent on specific algorithmic refreshes, we’re not fully there yet. As I mentioned, much of the rank volatility is a series of reversing rank positions. Changes to these ranking patterns, again, are often not solidified until the rollout of an official Google update, most commonly, an official core algorithm update. 

Until the longer-lasting ranking patterns are set without the need to “press the button,” we don’t have a full-on continuous or “real-time” Google algorithm.

However, I have to wonder if the trend is not heading toward that. For starters, Google’s Helpful Content Update (HCU) does function in real-time. 

Per Google:

“Our classifier for this update runs continuously, allowing it to monitor newly-launched sites and existing ones. As it determines that the unhelpful content has not returned in the long-term, the classification will no longer apply.”

How is this so? The same as what we’ve been saying all along here – Google has allowed its machine learning to have the autonomy it would need to be “real-time” or as Google calls it, “continuous”: 

“This classifier process is entirely automated, using a machine-learning model.”

For the record, continuous does not mean ever-changing. In the case of the HCU, there’s a logical validation period before restoration. Should we ever see a “truly” continuous real-time algorithm, this may apply in various ways as well. I don’t mean to suggest that, were a “real-time” algorithm to arrive, there would be a ranking response the second you make a change to a page.

At the same time, the “traditional” officially “button-pushed” algorithm update has become less impactful over time. In a study I conducted back in late 2021, I noticed that Semrush data indicated that since 2018’s Medic Update, the core updates being released were becoming significantly less impactful.

the relation between Google's updates and rank volatility - Google moving towards a “real-time” algorithm

Data indicates that Google’s core updates are presenting less rank volatility overall as time goes on

Subsequently, this trend has continued. Per my analysis of the September 2022 Core Update, there was a noticeable drop-off in the volatility seen relative to the May 2022 Core Update.

lesser rank volatility seen during Google's core update in Sep 2022 - Google moving towards a “real-time” algorithm

Rank volatility change was far less dramatic during the September 2022 Core Update relative to the May 2022 Core Update 

It’s a dual convergence. Google’s core update releases seem to be less impactful overall (obviously, individual sites can get slammed just as hard) while at the same time its latest update (the HCU) is continuous. 

To me, it all points towards Google looking to abandon the traditional algorithm update release model in favor of a more continuous construct. (Further evidence could be in how the release of official updates has changed. If you look back at the various outlets covering these updates, the data will show you that the roll-out now tends to be slower with fewer days of increased volatility and, again, with less overall impact). 

The question is, why would Google want to go to a more continuous real-time model? 

Why a continuous real-time Google algorithm is beneficial

A real-time continuous algorithm? Why would Google want that? It’s pretty simple, I think. Having an update that continuously refreshes rankings to reward the appropriate pages and sites is a win for Google (again, I don’t mean instant content revision or optimization resulting in instant rank change).

Which is more beneficial to Google’s users? A continuous-like updating of the best results or periodic updates that can take months to present change? 

The idea of Google continuously analyzing and updating in a more real-time scenario is simply better for users. How does it help a user looking for the best result to have rankings that reset periodically with each new iteration of an official algorithm update? 

Wouldn’t it be better for users if a site, upon seeing its rankings slip, made changes that resulted in some great content, and instead of waiting months to have it rank well, users could access it on the SERP far sooner? 

Continuous algorithmic implementation means that Google can get better content in front of users far faster. 

It’s also better for websites. Do you really enjoy implementing a change in response to ranking loss and then having to wait perhaps months for restoration? 

Also, Google so heavily relying on machine learning and trusting the adjustments it makes would only happen if Google were confident in its ability to understand content, relevancy, authority, etc. SEOs and site owners should want this. It means that Google could rely less on secondary signals and more directly on the primary commodity: content and its relevance, trustworthiness, etc.

Google being able to more directly assess content, pages, and domains overall is healthy for the web. It also opens the door for niche sites and sites that are not massive super-authorities (think the Amazons and WebMDs of the world). 

Google’s better understanding of content creates more parity. Google moving towards a more real-time model would be a manifestation of that better understanding.

A new way of thinking about Google updates

A continuous real-time algorithm would intrinsically change the way we would have to think about Google updates. It would, to a greater or lesser extent, make tracking updates as we now know them essentially obsolete. It would change the way we look at SEO weather tools in that, instead of looking for specific moments of increased rank volatility, we’d pay more attention to overall trends over an extended period of time. 

Based on the ranking trends we already discussed, I’d argue that, to a certain extent, that time has already come. We’re already living in an environment where rankings fluctuate far more than they used to, which has, to an extent, redefined what stable rankings mean in many situations.

To both conclude and put things simply, edging closer to a continuous real-time algorithm is part and parcel of a new era in ranking organically on Google’s SERP.


Mordy Oberstein is Head of SEO Branding at Wix. Mordy can be found on Twitter @MordyOberstein.

Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

]]>
The new YMYL guidelines and what this means for marketers https://searchenginewatch.com/2022/09/08/the-new-ymyl-guidelines-and-what-this-means-for-marketers/ Thu, 08 Sep 2022 14:04:27 +0000 https://www.searchenginewatch.com/?p=144107

The new YMYL guidelines and what this means for marketers

30-second summary:

  • Your money or your life (YMYL) guidance has been updated to give more clarity on what Google is looking for within its quality rater guidelines
  • Focusing on reputation, both of the person creating the main content and the website hosting the main content, is key
  • YMYL trust isn’t just built on-site; off-site digital PR and link acquisition can also play a key role in building trust
  • Google also helped to clarify which websites/content might fall into the YMYL categories and how this is defined
  • E-A-T continues to play an important role across the board, alongside matching user intent and purpose and creating great, reputable content for users

In late July, Google updated its Page Quality Rater Guidelines. It does this from time to time to reinforce the key principles that it looks for when evaluating the quality of a page. While Google has held the concept of expertise, authority and trust close to the center of these guidelines for a long time, one of the major changes or updates was related to the definition of “Your Money or Your Life” websites. There was also more insight into how these pages are rated, which is ideal for anyone working in these sectors looking to better understand how Google rates their websites.

The concept of Google having very high Page Quality rating guidelines for ‘Your Money or Your Life’ (YMYL) websites isn’t new, but the definition of what falls into this category has changed. Previously the definition covered “pages (which) could potentially impact the future happiness, health, financial stability, or safety of users.” This has been updated to cover “pages (which) have a high risk of harm because content about these topics could significantly impact the health, financial stability, or safety of people, or the welfare or well-being of society.” This is a much broader scope of websites with potentially a much more significant impact.

As such, for many SEOs this means re-examining the guidance to ensure that our websites are ready for potentially enhanced scrutiny.

So what are the new guidelines and what does it mean?

The new guidelines for YMYL go much further than just the definition update above. They actually go into detail around how a particular topic could and couldn’t fall into the YMYL categorization. Google has even put it in a handy table for us so we can clearly understand:

Google Search Quality Evaluator Guidelines - YMYL

Source: Google Search Quality Evaluator Guidelines

It’s also not just YMYL categories that have seen the updates, but many elements that go into rating YMYL pages. Along with enhancements to key E-A-T definitions and what Google is looking for, we can also see key updates to sections that focus on “low-quality pages” or what we should try to avoid. As marketers, we’ve never had so much information available to us about what Google is looking for in a quality website. This means that Google is likely to be getting very serious about its Page Quality Rater Guidelines and as SEOs, we should be too.

Content is as important as ever

Content will already be at the forefront of many SEO minds given that Google’s “Helpful Content” update has already started rolling out. Additionally, the updates to the YMYL guidance have demonstrated that your on-site content is a key contributor to how the pages are evaluated for expertise, quality, and authority.

Google highlights in section 4.2 that the “quality of the MC is one of the most important criteria in Page Quality rating.” So we know that the main content on the website is something Google is looking at with close scrutiny, especially if your website falls into that YMYL category. Having a reasonable amount of good quality main content plays a key role in this, but so do the page’s functionality and features. Don’t just rest at making sure your content is great, ensure that any features on the website such as calculators, checkouts, and interactivity are also created to a high standard.

Content that falls into YMYL sectors is, of course, held here to a higher standard. Google gives the example that “high E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed and updated on a regular basis.”

If you find yourself in a YMYL category, then regularly updating, reviewing, and editing your content to ensure that it’s up to date will play a role here.

Enhancing key E-A-T signals

For most businesses refreshing your ‘About Us’ page might seem like the most unimportant task, but when you are trying to tell users about who you are, showcase your expertise and give users that sense of trust and security, this can actually be one of the most important elements of your website. In section 2.5.3 of the guidelines, Google highlights that this can be one area of your website where raters go to find information about who owns the site, which can be a key element of establishing a good reputation.

Your reviews also fall into this category and that’s not just reviews on your own website, but also reviews on external sources. In fact – the word “reviews” is mentioned 66 times in the guidelines alone. While reviews on your own website are important and it’s definitely worth promoting these, one tip I picked up from the guidelines (section 2.6.4) is to do a quick reputation search. You can then evaluate if there are any other external website reviews or reputation signals that you need to be aware of. You can do this by using a negative site search i.e. for Google you would use [google -site:google.com] which would search for the term “Google” on all sites except google.com. Doing this for your business can help identify how others may view your reputation.
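To make the negative site search repeatable, here is a small sketch that builds the equivalent Google search URL programmatically. The helper name `reputation_search_url` is hypothetical; only the `-site:` exclusion operator itself comes from the guidance above.

```python
from urllib.parse import urlencode

def reputation_search_url(brand: str, own_domain: str) -> str:
    """Build a Google query that finds mentions of a brand everywhere
    except the brand's own site, using the -site: exclusion operator."""
    query = f'"{brand}" -site:{own_domain}'
    return "https://www.google.com/search?" + urlencode({"q": query})

url = reputation_search_url("Google", "google.com")
print(url)
# https://www.google.com/search?q=%22Google%22+-site%3Agoogle.com
```

Running this for your own brand and domain reproduces the [google -site:google.com] style search from the guidelines without typing the operator by hand each time.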

Reputation matters

Two of the five most important factors in Page Quality rating relate to reputation and information; that is, information about who is responsible for the main content, and the reputation of that person and of the website itself. We knew from the Medic Update that authorship and author profiles have grown in importance, and as the guidelines now focus on the reputation of both websites and authors, this has become an even more important facet of showcasing your expertise and authority.

In sections 2.6 and 2.6.1 of the updated guidelines, Google talks about reputation research around both the creator of the main content and the website hosting it. It also talks about the type of reputation information that is available and how applicable it is within certain industries (for example, how applicable product reviews would be in the finance sector). It’s clear that building strong reputation information that is relevant to your brand/industry would add value here.

Finally, for websites that are smaller or perhaps don’t have a huge amount of visible reputation information, Google does state that “this is not indicative of positive or negative reputation… for these smaller businesses and organizations, lack of reputation should not be considered an indication of low page quality.”

Trust is built on-site and off-site

Trust and authority are two of the key elements that go into rating a page’s quality, and these are key for strong YMYL performance. However, this doesn’t just come down to content and updates on the site; it’s also very much about what is available off-site. Digital PR has seen unprecedented growth in recent years as a great way of growing a website’s reputation as well as building high-quality, authoritative backlinks back to a website.

Whether it’s looking for reputation information or key signals about your brand, one of the biggest places people are searching is on websites that aren’t yours. That’s where digital PR can have the biggest impact on improving your reputation, expertise, and overall authority. Digital PR can help to build your website and your author reputation by sharing thought leadership or data expertise. This is a great way to build up these core YMYL factors while also gaining great coverage for your brand.

Keeping the user in mind

Regardless of whether you are looking to devise a digital PR strategy, improve your on-site content or make changes to the structure of your website, with the new guideline updates and YMYL changes, it’s clear that Google wants to see and understand the reputation of your website and its content creators.

Keeping these elements and the user in mind will help to ensure that you’re creating a great user experience that naturally demonstrates expertise, authority, trust, and any other signals that Google is looking for. As Google continues to improve and update its guidelines, this will become more important than ever.


Amanda Walls is the founder and Director of Cedarwood Digital, an award-winning Digital Marketing agency specializing in SEO, PPC, and Digital PR.

With 12 years of digital marketing experience under her belt, Amanda founded the business six years ago; it was named UK Small Ecommerce Agency of the Year in 2021.

An expert in all things digital, Amanda has worked as a trainer for Google’s Digital Garage in the North West and has delivered digital marketing training to thousands of marketers across the region.


]]>
How to drive B2B conversions from your organic traffic https://searchenginewatch.com/2022/06/03/how-to-drive-b2b-conversions-from-your-organic-traffic/ Fri, 03 Jun 2022 17:17:24 +0000 https://www.searchenginewatch.com/?p=143900

How to drive B2B conversions from your organic traffic

30-second summary:

  • B2B conversion funnels are long and unpredictable, and your SEO strategy should reflect that
  • Because it takes several touchpoints for a buying decision to be made, a B2B SEO strategy should focus on both informational and commercial phrases
  • Brand-driven search is crucial for your conversions because B2B customers tend to carefully consider all options
  • While optimizing for informational queries is important, make sure you have distinct conversion paths on each page
  • Create consistent visual identity across on- and off-site channels to improve brand recognizability at each touchpoint

There’s one key difference between B2B and B2C conversions: B2B shopping is almost never spontaneous. It takes several decision makers (collectively referred to as a decision-making unit, or DMU) to review several options and make a choice.

A B2B shopping journey can thus take weeks and months.

Obviously, the organic search optimization strategy should address that challenge, ensuring that more of the clicks driven by organic positions result in leads and sales.

1. Create SEO-driven landing pages for both TOFU and MOFU parts of the sales funnel

Fundamentally, a B2B marketing funnel consists of three stages: top, middle and bottom. The final stage is where the final sale happens, and it may take eight touchpoints (i.e., instances of a potential customer seeing or interacting with the site in some way) for a buying decision to be finalized.

Traditionally, when it comes to SEO, businesses tend to prioritize landing pages that drive direct sales. In B2B that is hardly possible, because customers tend to make lots of searches prior to making a purchase.

This is why informational search queries (those driving top of the funnel) are as important in B2B as commercial queries are.

How-to queries

How-to queries are highly engaging because visitors tend to stay on the page while taking the steps in a tutorial.

These are also likely to be transactional queries that may drive conversions if you manage to solve the customer’s problem.

Filter your keyword lists to how-to queries and start your optimization efforts by providing useful instructions (where your product is included in a non-promotional context as part of the solution).

You can also use Google Search Console to find how-to queries your site is already ranking for, then come up with a plan to improve your positions for those:

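Once the query data is exported from Search Console’s Performance report, the filtering step can be sketched as follows. The rows here are hypothetical sample data, and `how_to_queries` is an illustrative helper name, not part of any Google API:

```python
# Hypothetical rows as exported from Search Console's Performance report
# (the real export includes query, clicks, impressions, CTR and position).
rows = [
    {"query": "how to build backlinks", "clicks": 120, "position": 8.4},
    {"query": "seo agency london",      "clicks": 40,  "position": 3.1},
    {"query": "how to do a site audit", "clicks": 15,  "position": 14.2},
]

def how_to_queries(rows, max_position=20):
    """Keep how-to queries that already rank but could climb higher,
    ordered by current average position (best first)."""
    return sorted(
        (r for r in rows
         if r["query"].startswith("how to") and r["position"] <= max_position),
        key=lambda r: r["position"],
    )

for r in how_to_queries(rows):
    print(r["query"], r["position"])
```

The same idea works directly inside Search Console’s UI with a query filter on “how to”; the script version is just easier to re-run as part of a regular reporting workflow.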

Google’s People Also Ask and Suggestions

Both People Also Ask and suggestions impact searching journeys because they show up while people search, giving them more ideas.

Moreover, both are dynamic, that is, they change depending on what people are typing in the search box or what they choose to click.

Because both of these search features can change the direction in which your customers are heading, you need to keep a close eye on those and optimize for each relevant query and question that shows up there.

Make sure you actually search for each of your target keywords and make notes of People Also Ask results and how to best address them on your site. You can use your current FAQ or Knowledge Base or answer each question in a dedicated article, depending on how in-depth an answer should be.

2. Keep a close eye on your (and competitors’) branded search queries

Because B2B purchases usually require long-term investment and commitment, B2B customers tend to carefully consider and compare all possible options and alternatives before finally making a purchase.

This means your brand name will be searched a lot.

Your brand will also be searched alongside your competitors.

Keep a close eye on your (and competitors’) branded search queries through search bar suggestions

No wonder in B2B these queries are always popular:

  • Brand name alternatives
  • Brand name 1 vs Brand name 2

Treat your brand name as a keyword and keep optimizing your site for it. It is a never-ending process because your competitors are likely to be doing the same.

Keep in mind that your brand-driven search is the most important part of your customers’ buying journeys.

3. Plan and monitor your search-driven buying journeys

Once those searchers land on your site, what do they do from there? 

While optimizing for informational-intent queries is important, don’t forget to plan distinct conversion paths from those informational pages down into your sales funnel: Invite people to schedule a demo with you, sign up for a webinar or sign up for a free trial.

Make sure to take full advantage of your lead magnets and lead-qualifying surveys: These normally make the best conversion path from an informational page because they match search intent and provide more answers to the covered questions.

Lead magnets work best when they are contextual, for example, cheat sheets, checklists and flowcharts make it easier to implement how-to content. HubSpot is a prime example of contextual CTAs and lead magnets done well:

HubSpot lead magnet example

Additionally, make sure all your assets are visually branded: Your organic-search-driven visitors should be able to remember you so that your tool looks familiar at the next touchpoint. 

Use your logo as a watermark on all images, keep your colors consistent within your site and across your social media channels and make sure all your downloads (ebooks, whitepapers, and other resources) include your visual identity elements and links back to your site.

From there, make sure you know how to monitor those conversion paths. Google Analytics Behavior Flow is a great way to track where people tend to go once they land on a certain page. You can segment this report to users referred to your site from organic search:

Plan and monitor your search-driven buying journeys to drive conversions through your organic traffic

Don’t forget to use the Facebook pixel so you can retarget those organic search visitors on social media and generate more touchpoints. You can also use retargeting when running YouTube ads. Both will remind your past visitors of your brand and take them closer to a conversion.

Conclusion

Converting your organic search traffic is always a challenge, especially in B2B niches where customers are not likely to commit to your product from the first visit. Yet, when you understand your goals better, a strategic approach will gradually improve your conversions and boost your lead generation efforts.


Ann Smarty is the Founder of Viral Content Bee, Brand and Community manager at Internet Marketing Ninjas. She can be found on Twitter @seosmarty.


]]>
Three critical keyword research trends you must embrace https://searchenginewatch.com/2022/05/19/three-critical-keyword-research-trends-you-must-embrace/ Thu, 19 May 2022 13:05:41 +0000 https://www.searchenginewatch.com/?p=143879

Three critical keyword research trends you must embrace

30-second summary:

  • Exact-match keywords are useful for researching patterns and trends but not so much for optimization purposes
  • When optimizing for keywords, optimize for intent and solve problems, don’t just match your page to the keyword
  • Brand-driven keywords should be your top priority because you cannot control SERPs but you can rank assets that will drive people back to your site
  • Instead of focusing on keyword strings, research your niche entities and find the ways to associate your business with those through on-site content and PR/link building efforts

If you ask an SEO expert to name one SEO tactic that has changed the most over the years, they are likely to confidently answer “link building.” Some will point to “technical tasks,” and very few will ever think of “keyword research.”

The truth is, most SEO tasks look completely different these days but few SEO experts have changed the fundamental way they do keyword research and optimize content for those keywords.

Yes, we seem to have finally left keyword density behind (unless Google forces it back) but fundamentally nothing has changed: We run keyword tools, find relevant keyword strings and use them as much as we can throughout a dedicated page.

In the meantime, Google’s understanding and treatment of keywords have changed completely.

1. Exact-match keywords are getting obsolete

Google has a long history of trying to understand search queries beyond matching word strings in them to the documents in the search index.

And they succeeded.

It started years ago with Hummingbird being first quietly introduced then officially announced in August of 2013.

Yet, few SEOs actually understood the update or realized how much of a change to everything they knew it was.

With Hummingbird, Google made it clear that they were striving for a deeper understanding of searching journeys and that this would ultimately fix all their problems. Once they know exactly what a searcher wants and learn to give them that, no fake signals or algorithm manipulations will impact their search quality.

Hummingbird was the first time Google announced they wanted to understand “things” instead of matching “strings of words.” In other words, with Hummingbird exact-match keyword strings started becoming less and less useful.

Then, after Hummingbird came BERT that helped Google to enhance its understanding of how people search. 

Exact-match keywords becoming obsolete after the Google BERT update (Image source: Google)

There’s a short but pretty enlightening video on the struggles and solutions of Google engineers trying to teach the machine to understand the obvious: What is it people mean when typing a search query?

That video explains the evolution of SEO perfectly:

  • Context is what matters
  • Google is struggling, yet slowly succeeding at understanding “context, tone and intention”
  • Search queries are becoming less predictable as more and more people talk to a search engine the way they think
  • Stop words do actually add meaning, and are often crucial in changing it.

The takeaway here: Keyword research tools are still useful. They help you understand the patterns: How people tend to phrase a query when looking for answers and solutions in your niche.

But those keywords with search volume are not always what people use to research your target topic. People search in diverse, often unpredictable ways: according to Google, 15% of daily searches are queries Google has never seen before.

Moving away from keyword matching, Google strives to give complete and actionable answers to the query. And that’s what your SEO strategy should be aiming at doing as well.

Whatever keyword research process you’ve been using is likely still valid: It helps you understand the demand for certain queries, prioritize your content assets and structure your site.

It’s the optimization step that is completely different these days. It is no longer enough to use that word in the page title, description and headings.

So when creating an optimization strategy for every keyword you identify:

  • Try to figure out what would satisfy the search intent behind that query: What is that searcher really looking for? A list? A video? A product to buy? A guide to follow? Even slight changes in a searchable keyword string (e.g. plural vs singular) can signal a search intent you need to be aware of.
  • Search Google for that query and look through search snippets: Google is very good at identifying what a searcher needs, so they generate search snippets that can give you lots of clues.

Notice how none of the high-ranking documents has that exact search query included:

Ranking resources for diverse keywords vs exact-match keywords (Image source: Screenshot made by the author)

2. Branded keywords are your priority

More and more people are using search to navigate to a website, and there are several reasons for that:

  • Several of the strongest browsers let people search from the address bar (these include Safari, on both desktop and mobile, and, obviously, Google Chrome)
  • People are getting used to voice search, so they just speak brand names to perform a search.

Ranking for branded keywords to funnel target audience to assets

Image source: Screenshot made by the author

In other words, your customers who likely know about your brand and are possibly ready to make a purchase – those hard-earned customers are forced to search for your brand name or for your branded query.

And what will they see?

It is astounding how many companies have no idea what comes up for their branded search, or how many customers they lose over poorly managed (or more often non-existent) in-SERP reputation management.

There are three crucial things to know about brand-driven search:

  • These are mostly high-intent queries: These searchers are typing your brand name intending to buy from you
  • These are often your existing, returning customers that tend to buy more than first-time customers
  • Both of the above factors make these queries your brand’s top priority.

And yet, you don’t have control over what people see when searching for your brand. In fact, monitoring and optimizing for those brand-driven queries is not a one-time task. It is there for as long as your brand exists.

  • Treat your brand name as a keyword: Expand it, optimize for it, monitor your site’s rankings
  • Identify deeper level problems behind your customers’ brand-driven searching patterns: What is it you can improve to solve problems behind those queries?

Identifying customer pain points for keyword research

Image source: Screenshot made by the author

Your branded search queries should become part of your sales funnel: everything from your About page to product pages and lead magnets should capture those brand-driven opportunities.

In many cases, when you see a large amount of brand-driven keywords, you may need a higher level approach, like setting up a standalone knowledge base.

3. Entities are key

Entities are Google’s way to understand this world.

Entities are all proper names out there: Places, people, brands, etc.

Google has a map of entities – called Knowledge Graph – that makes up Google’s understanding of the world.

Entities help Google understand the context and the search intent.

Using entities and semantic search

Image source: The beginner’s guide to semantic search

Being one of Google’s entities means coming up in searches where you were implied but never mentioned:

Using Google entities for keyword research

Image source: Screenshot made by the author

Through entity associations, Google knows what any search is about.

Entities should be the core of your keyword research process: What are the known entities in your niche, and how do you associate your brand with those entities?

Conclusion

Search engine optimization is evolving fast, so it requires an agile strategy for brands to keep up. If you are doing keyword research the old, exact-match way, your business is about 10 years behind!


Ann Smarty is the Founder of Viral Content Bee, Brand and Community manager at Internet Marketing Ninjas. She can be found on Twitter @seosmarty.

Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

]]>
Why we’re hardwired to believe SEO myths (and how to spot them!) https://searchenginewatch.com/2022/04/28/why-were-hardwired-to-believe-seo-myths-and-how-to-spot-them/ Thu, 28 Apr 2022 14:55:13 +0000 https://www.searchenginewatch.com/?p=143861 Give someone a fish and they’ll EAT for one day. Teach someone to fish and they’ll EAT for a lifetime. Yes, that’s an SEO pun. It’s also the goal of this article.

If you pop into either of the fantastic SEO communities on Twitter or LinkedIn, you’ll inevitably encounter some common SEO myths:

  • “Longer dwell time means a good user experience, so it must be a ranking factor”
  • “A high bounce rate indicates a bad user experience, so it must be bad for SEO”

Social media posts like these get tons of engagement. As a result, they amplify the myths we try to squash through repetition, false evidence, and faulty logic. The problem isn’t limited to social media, either. There are plenty of high-profile websites that package hypotheses as facts because readers eat them up.

These myths are a huge problem because they’re red herrings. They cause marketers to prioritize projects that won’t improve the content, user experience, or Google search performance.

So how can the SEO community rally around the truth? We can start by doing two things:

  1. SEOs must admit our personalities and professions hardwire us to believe myths. We have a deep desire for answers, control, and predictability, as well as a fierce distrust of Google.
  2. We need to recognize the psychological and environmental factors that influence our ability to sort fact from fiction.

So rather than busting individual myths, let’s ask ourselves “why?” instead. In other words, let’s learn to fish.

Internal reasons we believe SEO myths

Let’s dig into some internal factors, such as our thoughts and feelings, that influence our beliefs.

1. SEOs need structure and control

SEO is a fascinating branch of marketing because our performance is driven by a constantly evolving algorithm that we don’t control. In fact, there were more than 5,000 Google algorithm updates in 2021 alone.

In other words, SEOs live in a world of crippling dependency. Even the top-ranking signals that we know about can fluctuate based on the industry, query, or available content within Google’s index. For example, if you manage websites in the finance or health space, E-A-T is critical. If you publish news content, then recency is very important.

To gain a sense of structure and control, we look for more ways to influence outcomes. But there are two problems with that approach:

  • We overestimate the impact of individual ranking factors
  • We falsely believe something is a Google ranking factor that is not

Our need to amplify our own level of control is supported by psychology. A 2016 study revealed an individual’s need for structure made them more likely to believe in a conspiracy theory.

“The human tendency to recognize patterns even when none exist is shown to have applications in consumer behavior. The current research demonstrates that as one’s personal need for structure (PNS) increases (that is, requiring predictability and disfavoring uncertainty), false consumer pattern perceptions emerge.”

If you find yourself waffling between fact and fiction, don’t let your desire for control dictate your final decision.

2. The primal need to recognize patterns

The human brain is excellent at recognizing patterns. Throughout history, we’ve relied on that ability to make better decisions and ensure the survival of our species. Unfortunately, we’re so good at spotting patterns that we also fabricate them.

False pattern recognition has several drawbacks:

  • It might influence SEO decisions that could have a sitewide impact
  • If you overstate the connection publicly, others might misinterpret it as fact

An excellent example surfaced on Twitter recently. Google’s John Mueller was asked if adding too many links to your site’s main navigation could impact Google Discover traffic. The individual who asked the question ran several tests and saw positive results, but Mueller said it was merely an interesting correlation.

“I’d still go with ’unrelated’. As mentioned in our docs: Given the serendipitous nature of Discover, traffic from Discover is less predictable or dependable when compared to Search, and is considered supplemental to your Search traffic.”

Fortunately, this individual went straight to the source for an answer instead of publishing a case study that could have had serious implications for website navigation decisions.

3. Confirmation bias

It’s well-documented that people accept information that supports their beliefs and reject information that doesn’t. It’s a primordial trait that evolved when we began to form social groups. Early humans surrounded themselves with others who thought and acted the same way to ensure their survival.

One of the most famous confirmation bias studies comes from Stanford. For the study, researchers segmented students into two opposing groups based on their beliefs about capital punishment.

One group supported capital punishment and believed it reduced crime. The other opposed it and believed it had no impact on crime.

Each group was asked to react to two studies, one which supported their views, and one which contradicted them. Both groups found the study that aligned with their beliefs much more credible, and each became more entrenched in their original beliefs.

SEO practitioners are particularly prone to confirmation bias because we’re terrified of being wrong. We hypothesize, test, build, optimize, and iterate. If we’re wrong too often, we’ll waste time and money, and we could risk our reputation and our jobs.

We need to be right so badly that we may accept myths that confirm our beliefs rather than admit failure.

4. Lack of trust in Google

It’s safe to say most SEOs don’t trust Google. That has led to some of the longest-running SEO myths I could find. For example, even after seven years of repeated rejections from Google, many SEO experts still believe engagement is a ranking signal.

Here’s John Mueller shooting down the engagement myth in 2015:

“I don’t think we even see what people are doing on your website. If they are filling out forms or not, if they are converting and actually buying something… So if we can’t see that, then that is something we cannot take into account. So from my point of view, that is not something I’d really treat as a ranking factor.”

Nearly seven years later, in March 2022, John was asked the same question again, and his response was pretty much the same:

“So I don’t think we would use engagement as a factor.”

And yet, the SEOs piled on in the comments. I encourage you to read them if you want a sense of the intense level of mistrust. Essentially, SEOs overanalyzed Mueller’s words, questioned his honesty, and claimed he was misinformed because they had contradictory insider information.

5. Impostor syndrome

Even the most seasoned SEO professionals admit they’ve felt the pain of impostor syndrome. You can easily find discussions on Reddit, Twitter, and LinkedIn about how we question our own level of knowledge. That’s especially true in public settings when we’re surrounded by our peers.

Not long ago Azeem Ahmad and Izzie Smith chatted about impostor syndrome. Here’s what Izzie said:

“It’s really hard to put yourself out there and share your learnings. We’re all really afraid. I think most of us have this impostor syndrome that’s telling us we’re not good enough.”

This contributes to SEO myths in several ways. First, it erodes self-confidence, which makes individuals more prone to believe myths. Second, it prevents folks who might want to challenge inaccurate information from speaking out publicly because they’re afraid they’ll be attacked.

Needless to say, that enables myths to spread throughout the broader community.

The best way to combat impostor syndrome is to ensure SEO communities are safe and supportive of new members and new ideas. Be respectful, open-minded, and accepting. If more folks speak out when something doesn’t feel accurate, then we can keep some troublesome myths in check.

External reasons we believe SEO myths

Now let’s explore the external forces, like peers and publishers, that cause us to believe SEO myths.

1. Peer pressure

Peer pressure is closely related to impostor syndrome, except it comes from the outside. It’s a feeling of coercion from peers, whether a large group of SEOs, a widely known expert, or a close mentor or colleague.

Because humans are social creatures, our urge to fit in often overpowers our desire to be right. When something doesn’t feel right, we go with the flow anyway for fear of being ostracized. In fact, social proof can be more persuasive than purely evidence-based proof.

I asked the Twitter SEO community if anyone ever felt compelled to accept an SEO ranking factor as fact based on popular opinion. Several folks replied, and there was an interesting theme around website code.

“Back in 2014, a web developer told me he truly believed text-to-code ratio was a ranking factor. For a while, I believed him because he made convincing arguments and he was the first developer I met who had an opinion about SEO.”

—  Alice Roussel

“Years and years ago I wanted code quality to be a ranking factor. Many thought it was because it made sense to reward well-written code. But it never was. Browsers had to be very forgiving because most sites were so badly built.”

—  Simon Cox

Similar to combatting impostor syndrome, if we develop a more tolerant SEO community that’s willing to respectfully debate issues, we’ll all benefit from more reliable information.

2. Outdated information

If you publish content about SEO, then you’ll be guilty of spreading SEO myths at some point. Google updates its algorithms thousands of times each year, which means assumptions are disproven and once-good advice becomes outdated.

Trusted publishers have a duty to refresh or remove inaccurate content to prevent SEO misconceptions from spreading.

For example, in 2019 Google changed how it handles outbound links. It introduced two new link attributes into the nofollow family, UGC and sponsored, and began to treat all three of these as hints instead of ignoring nofollow links.

So if you wrote about link attributes prior to September 2019, your advice is probably out of date.

Unfortunately, most SEOs update content because it’s underperforming, not because it’s wrong. So perhaps publishers should put integrity above performance to strengthen our community.

3. Jumping on trends

Sometimes SEO myths explode because the facts can’t keep up with the virality of the myth. One of my favorite examples is the LSI keyword trend. This one pops up on Twitter from time to time, and thankfully Bill Slawski is quick to quash it.

Trend-based myths go viral because they tap into the fear of missing out (FOMO), and SEOs hate to miss out on the opportunity to gain a competitive advantage. They also resonate with SEOs because they appear to offer a secret glimpse into Google’s black box.

Although trends eventually fade, they will remain a thorn in our side as long as the original sources remain unchanged.

4. Correlation vs causation

The most difficult myths to bust are those backed by data. No matter how many times Google debunks them, they won’t die if folks come armed with case studies.

Take exact match domains (EMD) for example. This article lists several reasons why EMDs are good for SEO, using Hotels.com as a case study. But it’s a classic chicken and egg argument. Does the site rank number one for “hotels” because it’s an EMD? Or is it because the owner clearly understood SEO strategy and prioritized keyword research, link building, internal links, page speed, and high-quality content marketing for the last 27 years?

We also can’t discount the fact that the domain has 42 million backlinks.

But if you want to hear it directly from the horse’s mouth, Google’s John Mueller says EMDs provide no SEO bonus. Here’s what he said on Reddit:

“There’s no secret SEO bonus for having your keywords in the domain name. And for those coming with “but there are keyword domains ranking well” — of course, you can also rank well with a domain that has keywords in it. But you can rank well with other domain names too, and a domain won’t rank well just because it has keywords in it.”

This is obviously correlation, not causation.

To be clear, I fully support running SEO tests to learn more about Google’s algorithm. But it’s incredibly difficult to create a signal vacuum that prevents outside influences from skewing your results. And even if you manage to isolate one ranking factor, you have no way of knowing how strong the signal is in relation to other signals. In a total vacuum, one signal may win. But in the wilderness of Google, it may be so weak that it’s virtually nonexistent.

Furthermore, the signal may only apply to certain types of content. We’ve seen signal fluctuations before regarding product reviews and E-A-T in YMYL spaces. So even if data suggests something might improve organic rankings, how reliable is the information, and how important is the signal?

All this is to say that we should be very careful when proclaiming new ranking factors, especially if they contradict Google’s statements or stray too far from universally measuring user experience.

5. It’s plausible, but not measurable

This group of myths is rooted in logic, which makes them particularly dangerous and sticky. Usually, they follow a simple formula: if A = B, and B = C, then A = C.

Here’s an example:

  • Google wants to rank content that provides a good user experience
  • If a webpage has a high bounce rate, it must provide a bad user experience
  • Therefore, a high bounce rate is bad for SEO

This seems to make sense, right? Yet, Google has said many times they can’t see what users do on your website, and they don’t look at bounce rate.

I’ve seen the same argument applied to dwell time, time on page, SERP click-through rates (CTR), and so on. To be clear, Google says CTR does not drive organic search engine rankings because that would cause results to be overrun with spammy, low-quality content.

Most often these myths stem from competing views about what a good user experience looks like and how to measure it. What constitutes a good experience for one type of search query might be a terrible experience for another. This lack of consistency makes it virtually impossible to identify metrics that can be deployed universally across all websites.

In other words, if potential user experience signals depend on too many factors, Google can’t use them. That’s why they launched the page experience update in 2021 which quantifies user experience with specific, universal metrics.

Here’s your fishing pole

In many cases, SEO myths fall into more than one of the above categories which makes them even more difficult to dispel. That’s why we keep seeing social media posts falsely identifying ranking factors like keyword density, domain authority, conversions, and meta keywords.

If you understand a few basic concepts about ranking factors, you’ll be better equipped to sort fact from fiction and prioritize SEO initiatives that drive more organic traffic.

Ask yourself these five questions when you smell the stench of a myth:

  • Is it quantifiable and measurable?
  • Is it scalable?
  • Is it broadly or universally true, or does it depend on the user?
  • Does it support Google’s goals of delivering a better user experience?
  • Has Google confirmed or denied it publicly?

If you can check each of those boxes, then you may have a valid ranking factor on your hands. But don’t take my word for it. Run some tests, ask some friends, use logic, and confirm your theory. And if all else fails, just ask John Mueller.


Jonas Sickler is a published author and SEO manager at Terakeet. He writes about SEO, brand reputation, customer attention, and marketing. His advice has appeared in hundreds of publications, including Forbes, CNBC, CMI, and Search Engine Watch. He can be found on Twitter @JonasSickler.


]]>
2022 Search ads 360 update: What you need to know https://searchenginewatch.com/2022/03/23/2022-search-ads-360-update-what-you-need-to-know/ Wed, 23 Mar 2022 07:31:27 +0000 https://www.searchenginewatch.com/?p=143813


30-second summary:

  • The Search Ads 360 platform has seen one of its biggest updates in 10 years
  • Performics’ Senior Media Manager Alex Medawar shares key highlights of the update, covering budget optimization, performance monitoring, and inventory management

Google recently announced a new update to its Search Ads 360 platform – and it’s a big one. SA360 has gotten even more powerful since it was first launched over 10 years ago, making it simpler than ever for commercial enterprises to manage their search advertising efforts.

A select number of Search Ads 360 users finally gained preview access this past month.

The new platform experience will start rolling out over the coming months while allowing users to continue access in the classic experience. In this article, I outline what’s new and share effective ways to make the most of your budgets and inventory in the Search Ads 360 platform.

1. Greater support for alternative channels

One of the Search Ads 360 updates includes greater support for alternative search engines such as Microsoft Ads and Yahoo! Japan.

Other advertising channels have been neglected for years, and the consequence has been time-consuming workarounds to link data and make bulk changes.

As a result of the new update, Google promises that you will now be able to get more of your work done from the same place.

For Microsoft Advertisers, SA360 will now support additional features:

  • Responsive search ads
  • Call extensions
  • Local inventory ads
  • Access to a variety of audience types

For Yahoo! Japan advertisers, you can now utilize dynamic search ads and site link extension scheduling.

2. Access to new features

Search Ads 360 will now offer support for the newest features in Google Ads including:

  • Performance Max – a new goal-based campaign type that lets performance advertisers access all of their Google Ads inventory from a centralized campaign. 
  • Discovery campaigns – allows advertisers to run ads in Google discover feeds to deliver highly visual, inspiring personalized ad experiences
  • Display and YouTube Advertising (previously only in the platform)

Google has also added advanced enterprise innovation features that will allow teams to scale everyday tasks such as:

  • Managing campaigns
  • Creating automated rules
  • Using labels across various advertisers simultaneously

The addition of Templates will combine current features like inventory management and ad builder for a unified and scalable experience.

new features in the Search Ads 360 update - inventory management and ad builder

For media managers who spend hours crafting forecasts, the new Performance Center will include enterprise planning capabilities with spend, CPA, and conversion forecasts in the coming months.

3. Updated inventory management

A revamped inventory management system provides streamlined workflows and more powerful controls over how you use your ad space.

Utilizing an inventory feed, SA360 can generate ready-to-go paid search campaigns using dynamic data such as price, description, and availability of your product from your feed.

This tool is especially useful for industry verticals with frequently changing prices and availability such as:

  • Airlines
  • Hotels
  • Live and Streaming Entertainment
  • Recruitment
  • Retail
  • Businesses with geo-specific offers

How it works

  1. Provide high-quality data and make a list of attributes for your feed such as product name, price, and landing page.
  2. Create templates for each type of output you want generated such as a campaign, ad group, ad, or keyword. (Tip: start small!)
  3. Utilize functions and attributes to generate highly relevant ads.
  4. Check your output and optimize until you are happy with the results.

Within minutes, you’ll have targeted, ready-to-launch campaigns in your account.
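The feed-to-template idea above can be made concrete with a minimal, hypothetical sketch of how templated ad copy is generated from an inventory feed. The feed fields (`name`, `price`, `availability`, `landing_page`) and the template strings are illustrative assumptions, not the actual SA360 schema or API.

```python
# Hypothetical sketch: ad copy as a pure function of inventory feed data.
# Field names and templates are illustrative, not the real SA360 schema.

inventory_feed = [
    {"name": "Hotel Aurora", "price": 129, "availability": "available",
     "landing_page": "https://example.com/hotels/aurora"},
    {"name": "Hotel Borealis", "price": 89, "availability": "sold_out",
     "landing_page": "https://example.com/hotels/borealis"},
]

HEADLINE_TEMPLATE = "{name} from ${price}/night"
DESCRIPTION_TEMPLATE = "Book {name} today. Rooms from ${price}."

def build_ads(feed):
    """Generate one ad per in-stock feed row; skip unavailable inventory."""
    ads = []
    for item in feed:
        if item["availability"] != "available":
            continue  # dynamic data keeps stale offers out of the account
        ads.append({
            "headline": HEADLINE_TEMPLATE.format(**item),
            "description": DESCRIPTION_TEMPLATE.format(**item),
            "final_url": item["landing_page"],
        })
    return ads

ads = build_ads(inventory_feed)
print(ads[0]["headline"])  # Hotel Aurora from $129/night
```

In SA360 itself the templating happens inside the platform; the point of the sketch is that ad copy becomes a function of feed data, so price and availability changes propagate to your ads automatically.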

In the new Search Ads 360, marketers will be able to manage templates across client accounts to update ads at scale.

budget management in Search Ads 360

4. Budget management

Any media manager will tell you that managing account budgets and pacing is one of the most critical components of campaign management and also one of the most difficult, especially at scale.

As part of the latest Search Ads 360 release, budget management will be improved and integrated with the new ‘Performance Center’.

Later this year, Google plans to provide complete access to these planning tools, allowing you to experiment with a variety of potential media budget flighting scenarios.

Features

The following are some of the features included in the present budget management system:

  • Visual graphs that include target and estimated spend, plus KPIs such as CPA (cost per acquisition) or revenue
  • Automatic budget allocation and bid adjustments set by your chosen budget bid strategy
  • Forecasting capabilities based on historical performance data that factors in seasonality
  • Estimated cumulative spend and likelihood to hit target spend based on historical data
  • Pacing reports at the daily, weekly, monthly, quarterly, and annual level

As Google adds new features throughout the year, we can anticipate that these tools will become more accurate and streamlined for enterprise planning.

New look

The new Search Ads 360 experience closely resembles the Google Ads platform with similar navigation and a familiar user experience.

Upon launching the SA360 platform, you can see the identical account overview dashboard found in Google Ads for seamless navigation between the two.

Into the future

With the new Search Ads 360 update, Google opens doors for the next generation of enterprise innovations to optimize performance.

The new updates will help you get more work done in one place saving time and providing a better cross-channel view for data-driven decision making.

To learn all about the new tools, enroll in Google’s new Skillshop modules for Search Ads 360.


Alex Medawar is Senior Media Manager at Performics and creator of Alex Medawar.com. As a seasoned digital media expert, Alex Medawar focuses on B2B paid search campaign management and strategy for global brands in the tech space. Utilizing a data-driven approach, Alex believes that both small businesses and large enterprises alike can speak to their audience and drive results within the digital media landscape.


]]>
Here’s what an ROI-worthy search advertising budget looks like in 2022 https://searchenginewatch.com/2022/03/17/heres-what-an-roi-worthy-search-advertising-budget-looks-like-in-2022/ Thu, 17 Mar 2022 12:08:21 +0000 https://www.searchenginewatch.com/?p=143805


30-second summary:

  • Digital marketers experience a potential ROI tunnel vision when it comes to search advertising
  • Seriously, do you need to burn dollars on those high-competition keywords? Does it trickle down into actual business results?
  • How do you not lose vision and outweigh the paid search cost with your revenue?
  • We’re bringing you the finer details of designing a paid media budget straight from an SEO expert and serial entrepreneur

It’s a bit of an understatement to say that success in digital marketing depends on a whole lot of things. There’s your skill-set, your team that helps you, and your understanding of the market where you’re trying to make a dent, either for yourself or your clients.

But how often do you think about your budget? Specifically, we’re talking about your search advertising budget here.

On its face, running paid media ads on Google Ads, the Google Display Network, Facebook, Microsoft, and other platforms is pretty simple: you bid on your keywords, define your target audiences, and run your ads for the length of the campaign.

You might not think that your budget factors into things beyond showing you the funds you have to work with, but I argue there’s more to it than that, especially when every dollar counts and you risk tunnel vision on ROI.

The thing is, only you will be able to say ultimately what your ROI-worthy search advertising budget will look like this year, but in this article, I’ll explain how to design your paid media budget to strike gold in 2022.

The basics: What do you want?

So, you want to know what your search advertising budget should look like in 2022.

Let me ask you this first: who are you, how big is your business, how much do you have to devote to search advertising, and, most importantly of all, what do you want to accomplish?

There are so many factors here that only you will know, but the questions I’d ask myself if I were looking at designing a search advertising budget for 2022 would include:

  • What do I want out of my campaigns?
  • How many conversions can I reasonably expect to get from my campaigns?
  • Is search advertising my only growth channel right now, or are there others?
  • How much will I also be putting into SEO or email?
  • How can I track my search advertising to make sure my performance is what I expect?
  • What will success look like?

Your budget is going to reflect what you want out of your campaigns, and what you want should reflect what growth looks like to your business.

For instance, are you an affiliate-marketing blogger who just needs more eyeballs on your pages? Are you a law firm looking for real, honest form-fills? Are you an ecommerce brand that’s retargeting your audiences for products they’ve viewed?

All of it matters, because your approach to your search advertising, and consequently your budgeting, will be determined by your goals.

Closing in: What do you need?

After figuring out what you want, it’s time to think of what you need to get there. Here’s where we’ll talk about hard figures: budgeting.

Only you will know what your search advertising campaigns should be producing (the results ideally will be based on the goals you’ve laid out).

So, if you want to grow by, let’s say, $2,000 a month, then you need to do some math to get there.

How many leads does your current search advertising campaign bring in? Of those leads, how many convert? Knowing your conversion rate will be key, as will knowing what each lead is worth to you and what your cost per lead is.

When you figure these things out, you’ll have a better idea of how to budget.

If a conversion will bring you $500, and your cost per lead is $10, and your conversion rate is five percent, then you need to bring in 80 leads a month through search advertising.

Here’s how it works.

You need four conversions a month to hit your $2,000 goal. You convert five percent of the leads you get. Four is 5% of 80. You, therefore, need 80 leads per month to reach your goal.

And if you pay $10 per lead, then your budget should be $800 a month for search advertising.
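The arithmetic above generalizes into a simple chain: conversions needed = revenue goal ÷ revenue per conversion; leads needed = conversions ÷ conversion rate; budget = leads × cost per lead. A quick sketch using the article's example figures:

```python
# The budget arithmetic from the text, as a reusable calculation.
# All figures are the article's example numbers, not benchmarks.

def required_budget(revenue_goal, revenue_per_conversion,
                    conversion_rate, cost_per_lead):
    conversions_needed = revenue_goal / revenue_per_conversion
    leads_needed = conversions_needed / conversion_rate
    return leads_needed, leads_needed * cost_per_lead

leads, budget = required_budget(
    revenue_goal=2000,           # grow by $2,000 a month
    revenue_per_conversion=500,  # each conversion is worth $500
    conversion_rate=0.05,        # 5% of leads convert
    cost_per_lead=10,            # $10 per lead
)
print(f"{leads:.0f} leads -> ${budget:.0f}/month")  # 80 leads -> $800/month
```

Plugging in your own numbers makes it easy to see how sensitive the budget is to conversion rate: halving the rate to 2.5 percent doubles both the leads required and the monthly spend.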

Now, that’s an ideal situation. That’s assuming you can make it all happen consistently like that, month after month.

In the perfect world, that budget will indeed be ROI-worthy.

But campaigns may fail, certain methods may not follow through for you.

How can you ensure your budgeting and efforts are worthwhile?

Pulling it together: Get smart about bidding

You want to design an ROI-worthy search advertising budget for 2022. That means you want to be in the big leagues like your competitors. What do you think they’re doing that you aren’t? Do they have some insight into Google Ads that you don’t?

No, it really comes down to your keyword strategy for your ads.

In case you didn’t know, it works like this in SEO, too: the more mainstream, general, and competitive keywords – such as “SEO company” – are going to be pretty expensive to bid on. Depending on your budget, you may not be able to sustain that kind of campaign for long, and it’s going to end up as a lot of wasted dollars.

But again, look at your similarly sized competitors. They probably have roughly the same budget as you do. If they’re outperforming you, they may have a smarter keyword bidding strategy than you do.

Taking the example from above, maybe you don’t want or need to rank your ads for “SEO company.”

A longer-tail keyword such as “SEO agency for link building” will cost you less and have fewer monthly searches. But as in any sales funnel, when searchers get more specific, they tend to be more ready to convert.

Just remember that when you get more specific, you’re going to want to home in on the quality and relevance of your ads’ corresponding landing pages.

A long-tail keyword search requires a long-tail ad, and a long-tail ad requires a long-tail landing page (so to speak). Be sure to deliver on what your ad promises. Surely, you can develop content related to hiring an SEO agency for link building.

Think of those funnels here. People want to see content related to where they are in the buyer’s journey. When they see it, they will be more ready to convert. It works the same in SEO.

If you want to talk about really homing in on ROI with your search advertising, that’s the way to do it.

What will you do next?

Many businesses spend between seven and 12 percent of their annual budget on marketing. It’s a necessary expenditure for growth.

If you want to make sure that whatever you spend on your search advertising this year is actually worthy of a satisfactory ROI, study the tips I have laid out. Know your strengths, what you can do, and your bidding limitations, as well.

If you’re smart, you can really build something great.


Kris Jones is the founder and former CEO of digital marketing and affiliate network Pepperjam, which he sold to eBay Enterprises in 2009. Most recently Kris founded SEO services and software company LSEO.com and has previously invested in numerous successful technology companies. Kris is an experienced public speaker and is the author of one of the best-selling SEO books of all time called, ‘Search-Engine Optimization – Your Visual Blueprint to Effective Internet Marketing’, which has sold nearly 100,000 copies.


]]>