Best practices for anchor text optimization in 2018
https://searchenginewatch.com/2017/12/22/best-practices-for-anchor-text-optimization-in-2018/ (Fri, 22 Dec 2017)

Doing SEO for any client is intimately associated with getting the most out of every link.

Anchor text is an important element that “unlocks” every link’s potential — to the extent that Google had to roll out its first Penguin update in 2012, cutting tried-and-true anchor text over-optimization methods out of the picture.

Over the past five years, the best practices of anchor text optimization have considerably evolved. It is time to learn how anchor text best practices can allow you to get the most out of links in 2018.

Anchor text and Google Penguin

The release of Penguin 1.0 in April 2012 shook up the SERPs, affecting around 3% of all search queries in English, German, Chinese, Arabic, and other popular languages. Since then, there have been at least five major Google Penguin updates.

Since Google releases its Penguin updates only periodically, some SEO professionals and marketers take advantage of the gaps between them, pushing rankings up with gray-hat anchor text practices (e.g. targeted anchor texts and lower-quality link-building), and then getting penalized for doing so.

When it comes to the relationship between anchor texts and Google Penguin updates, the rule of thumb is simple: Follow Google’s guidelines and avoid trying to hack the system with overly aggressive anchor text practices. Sooner or later, Google will come up with a new update that wipes out rankings gained that way.

Major anchor text categories

Before providing specific tips on anchor text optimization, let’s recap the major categories of anchor text:

  • Branded — your brand name with a link placed on it (e.g. Search Engine Watch)
  • Naked URL — your site’s bare URL used as the anchor text (e.g. https://searchenginewatch.com/)
  • Website Name — your site’s name written out as “YourWebsite.com” and used as the anchor text (e.g. SearchEngineWatch.com)
  • Page/Blog Post Title — a page’s title anchor text with a link on it (e.g. How to future-proof your SEO for 2018)
  • Exact-match Keywords — a targeted keyword with a link on it (e.g. Tips for entrepreneurs)
  • Partial-match Keywords — a targeted keyword plus some other text with a link on it (e.g. Beginner tips for entrepreneurs, tips for entrepreneurs guide)
  • LSI Keywords — a keyword anchor text that is related to a targeted keyword (e.g. entrepreneurship tips, business tips for entrepreneurs, startup business success stories)
  • No Text — an image with a link on it
  • Generic (e.g. Click this link, Read more, Check this out)

Best practices for anchor text optimization

Keep it natural… and versatile

According to Google, every part of a website, including links and their associated anchor text, needs to provide real value to users. Links should be placed only where users expect to see them and where they lead to something genuinely valuable.

With Google’s algorithms getting smarter every year, you should avoid multiple repetitive and keyword-based anchors in your site’s anchor text cloud. Failure to do so will definitely result in a penalty.

To quote Neil Patel:

“I like building natural links, because that’s what Google wants. You can’t be smarter than the engineers who spend their workdays making the algorithm work smarter. So, stay off Google’s radar, focus on high-quality content and avoid a penalty on Google and other search engines.”

Of course, you need to link to high-quality, relevant pages and disavow all links from low-quality, non-relevant web pages. Getting links from sites with high Domain Authority, Page Authority, and Trust Flow is also a must.

Avoid over-optimization

Google does not appreciate overly-rich anchor text. A spammy, keyword-based anchor text cloud is a big red flag to Google: it signals blatant backlink manipulation, which results in penalties.

Instead, try to keep your anchor text natural by spreading it across your inbound links in the right proportions (more about this below). For instance, instead of placing “Software development company” in every guest post, try using something like “companies that develop software” or “the most reliable software development firms,” etc.

Keep anchors relevant to content

As time goes on, Google will only improve the algorithms responsible for understanding the actual meaning of a web page’s content. Since 2015, it has been drawing on DeepMind, its artificial intelligence subsidiary, whose technology allows machines to learn much as humans do.

Once Google knows what a given web page contains, it has no trouble figuring out whether a specific anchor text or link is relevant to that page’s content.

If you place an internal link with irrelevant anchor text on your own website, this is likely to harm your search ranking. The same is true for backlinks with irrelevant anchor text.

Google is obsessed with improving user experiences. It tries its best to provide relevant content in the most convenient manner. Clearly, non-relevant anchors with non-relevant links behind them lead users to non-relevant content, which Google does not appreciate.

Engage in relevant guest blogging

The relevance of the anchor text is one of the key factors in a successful, cost-efficient guest blogging campaign, or in any healthy anchor text cloud for that matter.

What it comes down to is this: If you are guest posting with the intention of pushing up your “Digital marketing tips” keyword, place links to pages that include information about digital marketing, with exact-match, partial-match, and LSI (Latent Semantic Indexing) keywords featuring the topic of discussion. Obviously, your “Digital marketing tips” anchor text, with an associated link, should not be put on websites that have nothing to do with digital marketing.

Note: Do your best to use LSI and partial-match anchors in your guest post. In this way, you will achieve a more natural-looking anchor text cloud and satisfy the Google gods.

Avoid links from and linking to spammy sites

While the first part of this one is self-explanatory (you should never build links from low-quality websites), it is not common knowledge that Google pays close attention to the websites you link to as well. In fact, since the release of Google’s Hummingbird update, this type of co-citation can play a key role in calculating your site’s SERP placement.

Check your outbound links to make sure you steer clear of low-quality sites. Even though you can get paid or rewarded with a couple of reciprocal links, linking to a toxic website has the potential to ruin your site’s authority and rank in the long run.

Distribute anchors in the right proportions

While the “right proportions” part is always up for debate, it is pretty much indisputable that you should:

  • Avoid stuffing your anchor text cloud with exact-match and partial-match keywords at all costs
  • Rely on branded and website name anchor texts (as they are allowed by Google and other search engines)
  • Sparsely use Page Title/Blog Post Title anchor texts (Adam White of Search Engine Journal claims that this is the single best anchor text for SEO)

So, what are the right proportions?

While the safe answer is “It depends,” some recommendations do exist. According to at least a couple of anchor text case studies, the golden formula is:

  • 50% — Branded anchor texts
  • 15% — WebsiteName.com
  • 10-20% — Naked URL
  • 10-15% — Page Title/Blog Post Title
  • 1-5% — Generic anchor texts
  • 1-5% — Exact- and partial-match keywords
  • Other
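As a rough illustration (these are the article's case-study ranges, not rules published by Google), the proportions above can be checked against a classified backlink export with a short script. The category names, the sample data, and the `audit_anchor_cloud` helper below are all hypothetical:

```python
from collections import Counter

# Suggested target ranges from the article, in percent of all inbound anchors.
# These are heuristics from case studies, not thresholds Google has published.
TARGET_RANGES = {
    "branded": (40, 60),
    "website_name": (10, 20),
    "naked_url": (10, 20),
    "page_title": (10, 15),
    "generic": (1, 5),
    "keyword": (1, 5),  # exact- and partial-match combined
}

def audit_anchor_cloud(anchors):
    """Compare a list of anchor-type labels against the suggested ranges."""
    counts = Counter(anchors)
    total = sum(counts.values())
    report = {}
    for category, (low, high) in TARGET_RANGES.items():
        share = 100.0 * counts.get(category, 0) / total
        report[category] = (round(share, 1), low <= share <= high)
    return report

# Hypothetical backlink export, already classified by anchor type.
sample = (["branded"] * 50 + ["website_name"] * 15 + ["naked_url"] * 15
          + ["page_title"] * 12 + ["generic"] * 4 + ["keyword"] * 4)
for category, (share, ok) in audit_anchor_cloud(sample).items():
    print(f"{category:13s} {share:5.1f}%  {'OK' if ok else 'out of range'}")
```

In practice the hard part is classifying each anchor into a category in the first place; that step usually stays manual or semi-manual.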

But, once again, make sure that you do a thorough analysis of your niche and competitors. Your first priority is to reverse-engineer the anchor text cloud of websites ranked at the top, and only then can you start adjusting your website’s anchor text cloud.

Focus anchors on deep-level pages

One of the most common mistakes that beginner SEO professionals make is focusing the anchors they build on top-level pages, mainly placing links to a homepage, landing pages, or even concrete product pages.

An anchor text cloud that is purely built around these shallow pages does not look natural to Google and other search engines, simply because people do not naturally place links in that way. As a rule, they link to worthy shareable content like blog posts.

What you should do is focus your anchors on relevant, deep-level pages. Not only will you create a natural, versatile anchor text cloud, but you will also allow visitors to navigate to top-level pages.

Place anchors where users pay the most attention

This is more of a psychology-type tip.

Since users often do not read but rather skim pages, a page’s first few paragraphs, its headings, subheadings, and imagery become focal points — people pay more attention there. Thus, it makes sense to put your anchor texts next to these “hot” parts of a page in order to increase click-through rates and engagement.

Do not be overly obsessed with this one, though. If users find a given anchor text descriptive and potentially valuable, they will click the link to check out what’s inside, one way or another.

Conclusion

Anchor text optimization practices evolve over time. As most of them get adjusted in line with Penguin updates, pay close attention to keeping your anchor text cloud natural and versatile; that is the first thing Google looks at.

An “organic” anchor text distribution, informed by the averages for your targeted niche and specifically for your competitors’ websites, plays a huge role, but keep low-quality links in mind too. Once your anchor texts are up to snuff, do a complete audit of incoming links to sift out and disavow those coming from untrustworthy, non-relevant websites.

To sum it up, you need to remain on the right side of Google, one way or another. Specifically, do not try to game the system — it will not work in the long run. Instead, make sure that your anchor text is natural (avoid over-optimization, use relevant anchors, do not link to low-quality websites), and use keyword-rich links once in a while to help you rank.

A Penguin 4.0 recovery case study
https://searchenginewatch.com/2016/11/08/a-penguin-4-0-recovery-case-study/ (Tue, 08 Nov 2016)

We’ve had time for the dust to settle after the real-time Penguin 4.0 update was released.

It’s now been more than four weeks since the Penguin update was implemented, and close to a month since Google announced the end of the rollout. The question on everyone’s mind now is: what can we learn from this update? How should we proceed now that Penguin is integrated into the core algorithm?

In this article, we’ll delve into an analysis of results pages to determine what can be inferred from this latest update. Unfortunately, we are not at liberty to disclose the site, but we are free to discuss the details and process that we followed to achieve this recovery.

Out of Antarctica

Many have complained that there have been few recoveries from Penguin 4.0. I think it’s important to analyze those sites that recovered so we can identify what works and what will continue to work now that Penguin 4.0 is real time.

Let’s start by looking at the organic traffic patterns according to SEMrush:

As you can see, this is an obvious case of a Penguin recovery. Traffic doubled when Penguin 4.0 was released, and thousands of long tail terms increased in the rankings.

The penalty

What caused this site to receive a Penguin penalty to begin with? This client came to us after engaging in the typical Pre-Penguin linking schemes. They had engaged in article submissions, profile spam, comment spam, web 2.0 spam, and other such links.

At the time of the penalty, the site lost between 40% and 60% of its organic traffic.

The strategy – cleaning out

We started with a link audit, mapping their backlinks by pattern. We looked at all of their keyword-anchored backlinks and disavowed them all.

Next, we looked at sites that fit the typical Pre-Penguin patterns, including article directories, forums, and web 2.0 sites. We disavowed every last one of them.

We didn’t stop there. We looked at every single backlink to look for low quality sites, scrapers, and any other site that wasn’t of value to Google. All of those were disavowed as well.

The strategy – building up

Knowing that many of the links that had previously supported the site’s rankings had been disavowed, we proceeded to earn high quality links.  

Our strategy was not to “build links” the way many other companies were doing. Instead, we pursued campaign-based digital PR with blogger, media, and press outreach, focusing on quarterly campaigns with high-level content creation featuring unique stories and research.

One of the campaigns was built around a survey of over 50,000 participants, exploring an industry-specific question that hadn’t been examined before. The findings were visualized with infographics, ebooks, and other digital assets.

The results included coverage in the Huffington Post, the Telegraph, USA Today, and the LA Times, among many others.

The campaigns were complemented by reaching out to bloggers and influencers, and sharing the findings of our study. Instead of offering compensation, bloggers and influencers were compelled by the fact that the study was unique and the data not found elsewhere.  

What should site owners do?

If you still have a Penguin 4.0 penalty, there is hope! Now that Penguin is real-time, you do not have to wait months or years for Google to run an update and to see the results of your work.  

You can now count on seeing recoveries sooner if you go through the effort of cleaning out your link profile and earning quality links.

If you do not have a Penguin penalty, then you can start building links a little more aggressively.

Google announced that negative links will no longer result in penalties. Instead, such links will simply not count, positively or negatively: low-quality links will not boost your rankings, but neither will they cause the terrible traffic losses they used to.

BEWARE

Manual evaluation of links and manual penalties will likely increase. With all of the disavow files submitted to it, Google has sufficient data to train automated spam filters, which will likely flag sites for human spam raters to review and penalize manually. DO NOT BUILD SPAM LINKS.

Recovering from manual penalties is infinitely harder than recovering from Penguin – especially now.  

What real-time Penguin means is that you can be a bit more aggressive, without fear of accidentally tripping algorithmic penalties that cause a loss of site-wide traffic. And if you DO build a few low-quality links along the way, you do not need to fear as much as you would have in the past.

Should you audit your disavow file?
https://searchenginewatch.com/2015/10/09/should-you-audit-your-disavow-file/ (Fri, 09 Oct 2015)

In October 2012, Google gave us the Disavow tool.

This allowed webmasters to instruct Google not to count the link metrics that flowed either through certain links or from certain domains.

If you’ve had a manual penalty or have been dealing with Google’s Penguin algorithm, you’ve probably filed a disavow file.
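For reference, the disavow file itself is a plain UTF-8 text file with one rule per line: a full URL disavows a single link, a `domain:` prefix disavows every link from that domain, and lines beginning with `#` are comments. The domains below are placeholders, not real examples:

```text
# Requested link removal on 2015-06-01; site owner never responded
domain:spammy-article-directory.example

# Disavow a single page rather than the whole domain
http://blog.example.com/some-post-with-a-paid-link.html
```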


For the last two years, I have reviewed a large number of disavow files, many of which were actually harming their sites’ rankings. In some cases, I have suggested auditing the disavow file to determine whether it should be modified and resubmitted.

Here are some possible scenarios in which a site owner may make the decision to thoroughly review their disavow file.

Have you relied on link auditing software to make disavow decisions?

Some link auditing tools can be quite helpful when it comes to organizing your links. But it is vitally important that you take this well-organized list and manually review the disavow suggestions the tool has made.

I have seen far too many businesses blindly take the report generated from one of these tools and submit it directly to Google. Usually when this happens, a large number of unnatural links go undetected and are not disavowed.

I recently viewed the backlink profile of a site that had relied on an automated tool for its disavow decisions and discovered unnatural links from 965 new domains that had not been disavowed. It’s no wonder this site was still struggling with Penguin.
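A coverage check like that can be sketched by diffing a backlink export against the disavow file. The function names and sample data below are hypothetical, and the hard part, actually judging which links are unnatural, remains a manual step:

```python
from urllib.parse import urlparse

def disavowed_domains(disavow_lines):
    """Collect every domain covered by domain: rules or URL rules in a disavow file."""
    domains = set()
    for line in disavow_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].lower())
        else:
            domains.add(urlparse(line).netloc.lower())
    return domains

def not_yet_disavowed(backlink_urls, disavow_lines):
    """Return linking domains that the disavow file does not cover."""
    covered = disavowed_domains(disavow_lines)
    linking = {urlparse(u).netloc.lower() for u in backlink_urls}
    return sorted(linking - covered)

# Hypothetical inputs: a small disavow file and a backlink export.
disavow = ["# audit 2015", "domain:spamdir.example", "http://bad.example/page"]
backlinks = ["http://spamdir.example/a", "http://new-spam.example/b"]
print(not_yet_disavowed(backlinks, disavow))  # ['new-spam.example']
```

Note that a URL rule only disavows that one page, so treating its domain as fully covered (as this sketch does) is an approximation; a stricter audit would track URL rules separately.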

Another problem I have seen is that these automated tools will often flag really good links for removal. In one case, an automated tool flagged some valid press mentions, from BBC News, The Guardian, and other great news sources, as unnatural links.


It’s important to remember that the disavow suggestions made by these tools are only suggestions; they need to be reviewed manually.

As such, if you have relied on an automated tool to make disavow decisions, it may be worthwhile to review your disavow file and see if you have accidentally disavowed some good domains. If you have, you can remove those lines from your disavow file and resubmit it.

Google should eventually start to count the link equity from these good sources again. However, it may take a while for the links to start helping again.

For more on this subject, read Can you disavow links you have previously disavowed?

Were you super aggressive while removing links for a manual action?

If you’ve ever tried to remove a manual unnatural links penalty from your site, you know that Google can be pretty strict when it comes to giving you that beautiful “spam action revoked” message.

After each failed attempt at reconsideration, site owners often trim away at their links, eventually becoming so desperate that they get rid of almost all of them. In many cases, this leads to unnecessary disavow decisions.

Auditing a disavow file after an overly aggressive link pruning can be tough. You certainly don’t want to try to game the system and reavow links that are unnatural. But if you feel that you have disavowed links that were actually valid, it may be worthwhile to have another look at your link profile.

A word of caution: If you decide to reavow links, be careful. It may be a good idea to have an impartial party review your reavow decisions to make sure that these links really are decent ones.

Did you hire an inexperienced person to do your link audit?

Sadly, this is a very common experience. Knowing which links to disavow can be difficult, and no one can accurately predict exactly which links Google wants to see disavowed.

Some decisions are easy, especially when the link is outright spam, but sometimes it can be hard to decide whether to disavow a link or not.

I’ve seen cases where, while performing a link audit, the SEO company decided to disavow every link that was anchored with a keyword.


The issue with this is that not all keyword-anchored links are unnatural. If a major news publication wrote about your company and legitimately linked to you using your keyword as the link anchor, this is a good thing.

Additionally, I’ve witnessed people disavow every single directory link pointing to the site. While directories certainly can be a source of unnatural links, there are many directories that are valid citations and good links. Removing or disavowing links from good directories can destroy your rankings, both in the organic rankings and in the local pack.

I’ve even had cases where small business owners blindly trust an SEO company to do a link audit, only to have that company disavow every single link pointing to their client’s site.

Final thoughts:

My intention in writing this article is not to advise people to try to game Google by reavowing links that were self-made for SEO purposes.

Instead, I would urge you to review your disavow file to see if perhaps you have been too aggressive in disavow decisions. You may find that reavowing some decent links that were disavowed in error eventually results in a positive difference in your rankings.

What do you think? Have you disavowed links in error? What are your experiences and thoughts on reavowing links? Let us know in the comments below. 

Scaling Link-Building: Sustainable Practices for Enterprise Clients
https://searchenginewatch.com/2015/09/15/scaling-link-building-sustainable-practices-for-enterprise-clients/ (Tue, 15 Sep 2015)

Enterprise clients operate at a very competitive scope and need links to help fuel their online marketing. Large brands, which once overlooked this, are learning that it takes a strategic and intentional approach to secure the links they need.

Moz and BuzzSumo recently released a study that analyzed 1,000,000 articles, and found no correlation between social shares and links. The same study found that more than 50 percent of articles had zero external links.

This is no small finding. Real businesses are investing into creating content for marketing and audience development, but they are failing to secure the visibility vital to success.

We know that search drives massive amounts of traffic, and we know that links remain critical to ranking within competitive SERPs. The point is that good marketing doesn’t automatically lead to links; it leads to link opportunity. This is why it is important to have designated members of your marketing team who are responsible for SEO and links.

You can’t afford to mess up technical SEO for site performance, nor can you afford to miss link opportunities for marketing performance. Technical SEO has long since earned legitimacy, but now link strategy is coming into its own within marketing departments.

For enterprise clients, the need for links is even greater given their competitive scope and the constant deployment of new marketing initiatives.

These clients aren’t looking for flavor-of-the-month marketing: they want scalable, long-term results. That means acquiring links at a sustainable scale, links that are able to move the needle in their marketing.

For an example of how SEO and links can help enterprise-level companies improve their online marketing, I’d encourage you to read my post from last month here on Search Engine Watch.

This discusses what it means to successfully scale a link campaign for a client in today’s market.

Scaling Doesn’t Mean Automation

In previous years of SEO, scaling links often equated to automation. Building links that moved the needle was easy because almost every link was able to do so.

Today Google’s Penguin algorithm ensures only relevant links with human value will make a difference in a long-term campaign. Every link needs to be legitimate, critically examined, and defensible.

The value of a link should be threefold:

  1. Valuable for the people who click the link.
  2. Valuable to the site hosting the link.
  3. Valuable to the site being linked.

This should be approached in that order. Any attempt to shortcut to number three while ignoring steps one and two will lead to a less-than-ideal link.

Automated link building was always a shortcut; it leads to spam links and plays no part in today’s marketing environment, and Google now polices it, as codified in its Webmaster Guidelines. The era of online shortcuts is over: the hypodermic needle and the magic bullet belong to marketing mythology.

Link development is about creating a guiding link strategy, manually promoting the value of your site for links, and intentionally pursuing link opportunities, while simultaneously maximizing human value. Creating a link strategy requires an astute understanding of the company, campaign goals, current and past marketing strategies, the competition, the linking environment, and the linkable assets.

To manually promote your site for links, you have to establish your value, define a target audience, and note the sites whose audiences overlap with yours.

Leveraging marketing opportunities for links is all about integration. Any link campaign should integrate with as much of the marketing department as possible – and beyond, if applicable. With integration, your job is to view other online marketing initiatives from the link opportunity perspective in order to make sure any opportunities are realized.

This requires intentional, strategic pursuit, especially as link campaigns will often involve multiple tactics, angles, audiences, and niches.

The only way to scale this type of link development is to also scale both sweat and creativity, and unless robots have recently started exhibiting these human qualities, that means scaling rests on the shoulders of humans. Of course, bringing more people into a campaign means paying more for it, and hiring people with experience and expertise tends to be expensive.

So how can you justify scaling human effort in order to scale your link development? By growing a campaign after you establish a strong foundational plan that has a high probability for success.

Launching a Link Campaign

Anytime you invest into a new marketing channel – be it paid, earned, owned, partnered, and so forth – it takes time to learn the ins and outs. You won’t operate at 100 percent efficiency right from the start.

This is especially true in link-building, as you must understand your client, the market, the competition, the linking environment, influential community members, and how to best approach other sites.

It simply takes time to successfully launch a link campaign and define project potential. These are some integral and definitive elements that should be prevalent within any campaign:

The Business:

  • Unique business value/unique selling point (USP)
  • Products/services offered
  • Brand voice and message, tone, and positioning
  • Target audience and buyer personas
  • Primary and secondary competitors
  • Existing relationships
  • Industry reputation

The Site:

  • SEO history
  • Linkable assets
  • Keyword themes
  • Backlink analysis

The Niche:

  • Influencers within the niche
  • Prominent industry sites
  • Ideal target audience
  • Linking environment and how willing industry sites link-out
  • Popular tactics within the niche
  • Common relationships within the niche
  • Heavily linked or popular industry content

Campaign Goals:

  • Definition for project success
  • KPIs
  • Link targets
  • Secondary goals
  • Project scope
  • Potential growth

Previous and Current Marketing Strategy Analysis:

  • Past online campaigns
  • Current online campaigns
  • Past offline campaigns with same or similar audience
  • Current offline campaigns
  • Partnerships, events, and community engagement

Competitor Strategy Analysis:

  • Current and previous marketing tactics
  • Backlink analysis
  • Identify popular and ranking pages
  • Competitive keywords
  • Identify online strategies

This is not intended to serve as a comprehensive list, but rather to exemplify some of the considerations often taken into account in successful link campaigns. Jon Cooper also has a great list of questions you can ask new clients.

Launching a campaign takes time, resources, care, and above all else – significant research. You need to truly understand the company, the brand, the client’s expectations, the industry potential, the niche opportunity, current and past marketing efforts, and the competitor’s impact.

Building a strong foundation within a campaign is key to future scalability.

Build a Foundation and Scale with Success

In order to create a campaign capable of scaling, you should establish a deep understanding of the company and brand. Then integrate across the company and identify as many opportunities as possible. Once you’ve officially determined the campaign’s potential, you can begin to scale as appropriate.

It can be overwhelming to scale a link campaign, because it requires clients to invest large amounts into their link projects, and you cannot move forward without trust and established value. However, the integration process can give you time to establish client trust, allowing you to acquire a few worthwhile links to important pages that ultimately prove the campaign’s efficiency.

Then it will be easier to grow the project successfully, both in terms of having the client invest more and growing the project without stumbling.

Scaling human effort can be another difficult obstacle. Every necessary minute spent learning the business, industry, niche, and competitors is a minute not spent securing links. That’s why it’s important to build a strong foundation prior to attempting to scale a project – both to meet client expectations and drive initial momentum.

Link development is more marketing than technical SEO. It requires a sustained effort that increases in strength over time, and it all begins with a solid foundation able to support scaling the project.

Takeaways

I will be the first to say that link-building isn’t for everyone.

It takes a company in the right stage of the marketing process to be ready for link development. If you are launching new marketing initiatives, have built link opportunity, and have established value online, you should already be thinking strategically about links. For those ready to invest in strategic link acquisition, I’d recommend spending ample time in research and discovery in order to cultivate a secure campaign foundation. Once you’ve identified opportunity and campaign potential, you are prepared to scale the project as necessary. Again, that means scaling characteristics exclusive to humanity: sweat, creativity, and sincere work effort.

If you’re not specifically addressing links within your marketing strategy, you’re hindering your own online marketing. The value of links on the Web isn’t going away. If you’re actively working to acquire links on the Web, I hope you appreciate the wisdom of scaling human effort and not seeking shortcuts.


The Current Status of Google’s Penguin Algorithm
https://searchenginewatch.com/2015/09/11/the-current-status-of-googles-penguin-algorithm/ (Fri, 11 Sep 2015)

I get a lot of emails saying things like this: “We were hit by Google’s Penguin algorithm. We did a thorough link cleanup six months ago and NOTHING has changed.”

An incredible number of small businesses have been decimated by Penguin. They’re trying to recover but are not succeeding.

Because Google’s updates of the Penguin algorithm are so infrequent, it is not uncommon for it to take a year or more to see the benefits of a link cleanup.

In this article, we will review the history of the Penguin algorithm, discuss its current status, and speculate on what its future is likely to hold.

The History of Google’s Penguin Algorithm

April 24, 2012 was a day that changed the lives of many people. Prior to this date, it was relatively easy to get a website to rank well.

You could do so by creating links in directories made for links, article galleries, and bookmark sites. The more links the better.

But in April of 2012, Google turned on the Penguin algorithm and made this announcement. As a result, a huge number of websites that had relied on the power of self-made links saw a drastic drop in rankings.

Unfortunately, a huge number of non-spammy, legitimate businesses were hit severely as well.

I hear story after story about businesses that hired someone to improve their online presence, and had no clue that they were breaking Google’s Quality Guidelines.

Low-quality link building took them from the bottom of the first page for their keywords to first-place rankings.

When Penguin hit, they were beyond page ten with no clue as to why. Many of these businesses, which had seen years of incredible growth powered by top Google rankings, were forced to lay off staff or even close down.

On May 25, 2012 – one month after the launch of Penguin – Google ran a Penguin update. There was one reported case of recovery, from WPMU.org. That site quickly moved to remove keyword-anchored links from the footers of sites that used its templates.

I didn't see any other recovery cases. This is not surprising, considering that at the time we had no disavow tool – it was not introduced until October of 2012.


Penguin Updates and Refreshes

Over the next few years Google ran several updates and refreshes of the Penguin algorithm. There is a difference between a Penguin update and a Penguin refresh. A Penguin update occurs when Google changes the criteria that Penguin uses in its calculations.

In a Penguin refresh, they use the same criteria as the previous time but rerun the algorithm. A site that has been demoted by Penguin is generally not able to recover until either an update or a refresh happens.

Here is a history of the Penguin reruns that we have had so far:

  • April 24, 2012: Initial launch of Penguin
  • May 25, 2012: Penguin Update
  • October 5, 2012: Penguin Update
  • May 22, 2013: Penguin Update
  • October 4, 2013: Penguin Update
  • October 17, 2014: Penguin Refresh

There were also a few possible minor refreshes of Penguin on November 27, 2014, December 2, 2014, December 5, 2014, and December 6, 2014.

If you look at that list, you can see that the last Penguin update was in October of 2013 – almost two full years ago. Wow – that means Google has gone a long time without changing the criteria it uses to determine whether your links are untrustworthy.

The Last Penguin Refresh

The last official run of Penguin was on October 17, 2014. This was supposed to be a slow-rolling change that could take a couple of months to complete, but it's quite clear that many sites were affected on particular dates at the end of 2014.

Google ran this refresh primarily to help site owners who had worked hard to clean their links. It is likely that they pushed this out as a response to the outcry from the community of site owners that had been waiting an entire year for Penguin to rerun.

Google employee Pierre Far explained the search giant's intentions, stating: “This refresh helps sites that have already cleaned up the webspam signals discovered in the previous Penguin iteration, and demotes sites with newly discovered spam.”

We definitely did see some sites make nice improvements, such as this one:

[Chart: organic traffic for a site that improved after the October 2014 Penguin refresh]

However, a huge number of sites that had done very thorough link cleanups were unable to recover. For some of these cases, there was a logical explanation as to why no recovery was seen. Here are some possible reasons why:

  • The link cleanup wasn’t thorough enough.
  • Not enough time had passed for Google to recrawl the disavowed links.
  • The site had other issues that were holding it down such as problems with Panda.
  • Its previous good ranking was due to the power of links that are now considered unnatural. Therefore the site did not deserve to rank.

I wrote a thorough article that discusses these points in greater detail. However, I have reviewed many cases of sites that I still believe genuinely deserve to rank well. They did extremely thorough link cleanups – some had over two years of disavow work in place. Yet these sites still appeared to be squashed by Penguin.

While it's possible that my theories about these sites still being suppressed by Penguin are incorrect, I think a likely explanation is that the current framework of Penguin makes it difficult for some sites to recover.

If you did not see improvement with Penguin 3.0 in October of last year – do not give up hope. According to many webspam team members I’ve spoken with, it is clear that Google is aware of the fact that small businesses are struggling to escape the grips of Penguin.

I also believe that Google is working to liberate the sites that make the effort to clean up their link profiles. What is still not clear is how much of a priority this problem represents to the team that is currently working to improve Penguin.

The Future of Penguin

We have now waited almost an entire year since the last official refresh of Penguin, and two years since the last update. I have heard some speculation from the community that perhaps Google will never update Penguin again. It’s a mess. In my opinion, there are two big challenges that are hindering the process:

  1. It's hard to run a punitive algorithm without opening the door to negative SEO. Google claims to be good at distinguishing negative SEO from self-made links, but I imagine that Google may have tried to refresh Penguin and found in pre-launch tests that sites were being unfairly hit by negative SEO.
  2. The search results could suffer. What would happen if Google launched a new version of Penguin that could target excessive amounts of guest posts and paid link placements on articles for authoritative sites? I bet that many well-known brands would be found guilty. If Google punished every site that used this tactic, this could possibly make their search results less useful to users.

Is it possible that Penguin will never update again? I don't think so. Google's Gary Illyes has said that they are working on an update that will be out in a few months, and that they are also working on making Penguin run more frequently.

Perhaps I am naive, but I do believe that the webspam team is working to make Penguin fair and ultimately better overall. I think we will see a good number of spectacular recoveries with the next update. If you have been waiting for a recovery, then hang in there.

So let’s circle back to the question that started this article off. If you have done a thorough link cleanup and have not seen recovery, it is possible that you just need to wait for Penguin to refresh.

If you submitted your disavow file after October of 2014, there has not been an official refresh since then. Those who filed a disavow after July of 2014 are also still playing the waiting game, because the recrawling and disavowing process generally takes a few months. But have faith: there is a decent chance that you will see positive improvement with the next Penguin update.

What do you think? Are you still waiting for Penguin to update? Do you think that future iterations of Penguin will make recovery more possible?

The Perils of Negative SEO: What You Need to Know
https://searchenginewatch.com/2015/08/11/the-perils-of-negative-seo-what-you-need-to-know/
Tue, 11 Aug 2015 11:30:00 +0000

We've been talking about negative SEO since the advent of Penguin, and many still wonder whether it's a real threat. The lore is that Google will protect websites from losing traffic due to negative SEO, but is this the truth, or just a myth propagated by Google?

A few days ago, I received an email in my inbox:

[Screenshots: the solicitation email offering negative SEO services for sale]

Yes, you read it right: large volumes of spam and fake links to destroy your competitor's rankings. Why spend a fortune on things like content marketing and link attraction when you can surgically remove your competitors with negative SEO?

The Birth of a New Industry

Think about it: what happened to the thousands of individuals who used to make a living offering cheap link-building services with 10,000 directory submissions and 500 article submissions? Since this black hat approach simply doesn’t work to rank sites anymore – in fact, the effect is the opposite – why not use the same skills for a different product or service?

There are entire companies that have survived by relying on these tactics. Instead of trying to compete in the “content marketing” game, where the ante has been seriously upped, why not stay in their game by obliterating the competition?

You can find dozens of these gigs on Fiverr – as well as many individuals on sites like freelancer.com and upwork.com – who offer these services. And for those who used to be in the black hat game, firing up software to drop blog comment, forum, and profile links doesn't take much and costs next to nothing.

What Sites are Protected?

As an experienced Penguin Audit Analyst who’s conducted hundreds of audits and penalty recovery campaigns, I’ve learned to identify sites that are vulnerable to attacks. Generally, sites that can sustain Negative SEO attacks have old, established link profiles with enough volume and authority not to be negatively impacted. Large retail brands, old informational sites, niche authorities: these sites can survive most Negative SEO attacks without experiencing a significant loss of traffic.

Other sites – those with few links, or few links from authoritative sources – can be easily impacted by negative SEO attacks. And unless the site owners are carefully monitoring their link profiles, including monthly link audits and reviews, they may not know until it's too late and a Penguin demotion or a manual penalty has already been applied.

Recovery from penalties is difficult and time consuming. Even if you jump through all the hoops, your site still may not recover to its pre-penalty days.

How Can You Protect Your Site from a Negative SEO Attack?

Start by monitoring your link profile carefully. Use a variety of tools – Google Webmaster Tools, Bing Webmaster Tools, Ahrefs, Majestic, Moz – to keep track of all of your inbound links. Compile all of your links from all of these sources into one list, using Excel formulas to dedupe, and compare with a previous list to identify new links gained.
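If you'd rather script the merge-and-compare step than wrestle with Excel formulas, it's a few lines of Python. This is a rough sketch under stated assumptions: the one-URL-per-line file format and the file names are placeholders, so adapt it to whatever your backlink tools actually export.

```python
# Sketch: merge backlink exports from several tools, dedupe them,
# and diff against last month's list to surface newly acquired links.
# File format (one URL per line) and file names are assumptions.

def load_links(path):
    """Read one URL per line, normalized to lowercase with no trailing slash."""
    with open(path) as f:
        return {line.strip().lower().rstrip("/") for line in f if line.strip()}

def new_links(current_files, previous_file):
    """Return links present in the current exports but not in the previous list."""
    current = set()
    for path in current_files:
        current |= load_links(path)      # set union dedupes across tools
    previous = load_links(previous_file)
    return sorted(current - previous)    # links gained since the last check

# Hypothetical usage:
# gained = new_links(["ahrefs.txt", "majestic.txt", "gwt.txt"], "last_month.txt")
```

Run this on a schedule and the `gained` list becomes your review queue for the quality checks described next.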

Once you have a list of newly acquired links, review them to determine their quality. Are these links healthy? What are the Domain Authority, SEMrush rankings, and Trust and Citation Flow scores? Review various quality markers for each of the sites to determine whether they are healthy or not.

Are the sites using keywords in anchor text? If you didn't build these links, that's a strong sign you are being targeted by a negative SEO attack.

Using the Disavow Tool

By monitoring your profile carefully and checking it at least every two to four weeks, you can catch an attack the moment it starts and update your disavow file with the problematic sites. If you catch them quickly, you can mitigate their possible impact on your rankings. It's important to add any potentially suspicious sites, as every low-quality site or anchor text keyword can skew your ratios and trigger a penalty.
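For reference, the disavow file Google accepts is plain text: one entry per line, a `domain:` prefix to disavow an entire domain, a full URL to disavow a single page, and `#` for comments. A minimal example (the domains and URL here are placeholders, not real attack sources):

```text
# Disavow file updated after the March link audit.
# Disavow entire spam domains:
domain:spammy-directory.example
domain:article-farm.example
# Disavow a single bad URL rather than a whole domain:
http://blog.example/low-quality-post-with-keyword-anchor/
```

You then upload this file through the disavow links tool in Search Console; it replaces any file you submitted previously, so keep a running master copy.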

Don’t Be a Victim

When I received that email, I knew that thousands of other individuals must have, as well. And as much as we'd like to believe everyone has the best of morals, the reality is that some people will inevitably be attracted to this type of proposal. If one of your competitors takes up a negative SEO strategy, the results for your business could be devastating.

Do you have any tips on how to keep your site safe from Negative SEO? Do share!

Link Quality: 50+ Questions to Ask
https://searchenginewatch.com/2015/07/21/link-quality-50-questions-to-ask/
Tue, 21 Jul 2015 11:30:00 +0000

One of the worst mistakes new SEOs and site owners make is trying to take shortcuts in link acquisition.

The Penguin algorithm, launched more than three years ago now, has drastically improved Google’s ability to algorithmically detect and punish sites with manipulative links. Google’s ability to detect such links continues to improve every year. If you’re building low-quality links, you’re putting your site and business at risk. Conversely, I’ve witnessed the impressive impact that quality links can have on a site’s visibility, traffic, and engagement.

Recently, in SEMrush’s Twitter chat, the overwhelming response to their question regarding link-building mistakes was trying to take shortcuts. In one form or another, almost every reply dealt with the variety of ways sites and SEOs try to shortcut the link-building process.

Matt Cutts, former Head of Webspam at Google, famously referred to this as “putting the cart before the horse” in his interview with Eric Enge, later citing link building as “sweat plus creativity.”

I've created a list of questions you should be asking about your own link-building activities to ensure you're only pursuing quality links. Asking these questions will help ensure you're creating links that improve the human experience on the web, as well as your site's performance in search. Bear in mind that no single question will guarantee link quality on its own, but taken together, enough of these questions should shed light on the value of a link.

Without further ado, let’s launch into it.

1. Questions Pertaining to Link Relevance

  • Is the site relevant to your site?
  • Is the site relevant to the page being linked to?
  • Is the linking page relevant to your page?
  • Is the link relevant to the content surrounding it? Is it contextually relevant to the page?

2. Questions Pertaining to the Human Value of a Link

  • Is the link a citation, recommendation, or resource?
  • Does the link provide value in the context it’s presented?
  • When a person clicks the link, will they be happy with the result?
  • Would a person be surprised by where the link takes them?
  • Does the link improve the page it lives on?
  • Does the link improve the web experience?

3. Questions Pertaining to Site Quality (of the Linking Website)

  • Is the site’s content compelling, clear, thematically relevant, and free of errors?
  • Does the site have clear and present value?
  • Are there real humans associated with the site?
  • Is there an address listed for the physical location of the business or site owner?
  • Are there signs of other humans engaging and interacting with the site?
  • Does the site have a social profile? Things to keep in mind include follower versus followed count; Facebook fans; Google+ interaction; overall engagement and interaction; and content shared.
  • Does the site link primarily to other good sites? Is it a “good link neighborhood?”
  • Is contact information listed for the owner or manager of the website?
  • Does the site appear to be part of a larger network? This could be a potential red flag.
  • Is the design of the website up-to-date?

4. Questions Pertaining to the Marketing Value of a Link

  • Can this link lead to a relationship or partnership?
  • Will this link generate exposure to a new, relevant audience?
  • Will this link generate further exposure to an important site or business asset?
  • Does the link represent a positive brand endorsement?
  • Will the page foster a positive user experience with your brand and site?

5. Questions Pertaining to the SEO Value of a Link

Can Google crawl the link?

  • Is the page crawlable?
  • Is there a meta robots tag? Does it include noindex or nofollow? Noindex keeps the page out of Google's index, and nofollow stops its links from passing value.
  • Is the page blocked in robots.txt?
  • Is the page using AJAX?
  • Is the page using JavaScript? Google can now crawl JavaScript.
  • Is the page indexed?
  • Is the link hidden behind a redirect?
  • Is there anything else that might interfere with Google’s ability to crawl the link?
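A few of the crawlability questions above can be answered with a quick script rather than by eyeballing page source. This is a simplified sketch using only Python's standard library; it operates on HTML and robots.txt content you've already fetched, and the regex-based meta parser is a convenience for illustration, not a substitute for a real HTML parser.

```python
# Sketch: check two crawlability signals -- the meta robots tag and
# robots.txt rules -- for a page you're evaluating as a link source.
import re
from urllib import robotparser

def meta_robots_directives(html):
    """Return the directives from a <meta name="robots"> tag, e.g. {'noindex'}."""
    m = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return {d.strip().lower() for d in m.group(1).split(",")} if m else set()

def blocked_by_robots(robots_txt, url, agent="Googlebot"):
    """True if the robots.txt rules disallow the URL for the given user agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)
```

If `meta_robots_directives` returns a set containing `noindex`, or `blocked_by_robots` returns `True`, a link on that page is unlikely to carry SEO value.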

Tag attributes:

  • Is the nofollow attribute used?
  • Is the link an image? Is the alt attribute used? (For an image link, the alt text acts as its anchor text.)

Anchor text:

  • Is a keyword used in the anchor text?
  • Is the link over-optimized? This is bad, especially if done at scale.
  • Was the anchor editorially created?
  • Did you ask for specific anchor text?
  • Is it a naked URL? (a linked full URL i.e. https://www.google.com/)
  • Is white noise such as “here,” “click,” or “this” used?
  • Does the anchor text appear natural in context?
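To spot over-optimization across a whole profile rather than link by link, it helps to bucket anchors into rough categories and look at the ratios. The category rules in this sketch are illustrative assumptions, not Google thresholds; the point is simply to flag a profile skewed toward exact-match keyword anchors.

```python
# Sketch: rough anchor-text distribution check. Category rules are
# illustrative assumptions, not known Penguin thresholds.
from collections import Counter

def classify_anchor(anchor, brand, keywords):
    a = anchor.strip().lower()
    if a.startswith(("http://", "https://", "www.")):
        return "naked URL"
    if brand.lower() in a:
        return "branded"
    if a in {"here", "click here", "this", "read more"}:
        return "generic"
    if any(k.lower() in a for k in keywords):
        return "keyword"          # the category Penguin scrutinizes most
    return "other"

def anchor_distribution(anchors, brand, keywords):
    """Return each category's share of the total anchor list."""
    counts = Counter(classify_anchor(a, brand, keywords) for a in anchors)
    total = len(anchors)
    return {cat: round(n / total, 2) for cat, n in counts.items()}
```

A profile where keyword anchors dwarf branded and naked-URL anchors is exactly the pattern that looks unnatural, since editorially given links skew heavily toward brand names and bare URLs.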

Link placement:

  • Is the link site-wide? This is typically bad.
  • Is the link on a relevant page?
  • Is the link in content?
  • Is the link high on the page?
  • Are there many other links on the page?
  • Are those links relevant to your page?
  • Are the other links to quality content and sites?

On-page elements:

  • Are the header tags relevant to your page?
  • Is the title tag relevant to your page?
  • Is the URL relevant to your page? Does it contain the word “link” or “list?”
  • Does the page contain 500 words or more?

6. The Gut Check

  • Are we happy to show the link to the client?
  • Are we happy to show the link to colleagues?
  • Would we be happy to show the link to other SEOs?
  • Would we be happy to show the link to family and friends?

This is not a 100 percent comprehensive list – it would be impossible to list every consideration possible to gauge the value of a link. The important thing is to be sure that you’re critically examining your links for value, the real gauge of link quality.

A link should be valuable for:

  1. The people who click the link
  2. The page the link lives on
  3. The site hosting the page
  4. Your page and website
  5. Google, as this link will signal the relevance and authority of your page

There should be no shortcuts in this process – build links that have value, and you won’t need to worry as Google updates their algorithms.

7 Content Growth Hacks to Help Drive More Qualified Traffic
https://searchenginewatch.com/2015/07/20/7-content-growth-hacks-to-help-drive-more-qualified-traffic/
Mon, 20 Jul 2015 12:30:00 +0000

Everybody wants maximum growth out of their content marketing efforts, but not everyone knows how to get those types of results. Sure, tweeting blog posts and emailing relevant resources as part of an outreach campaign are good first steps to getting some traction and exposure. But there's much more you can do to unlock exponential growth opportunities for your content.

Here are seven growth hacks I leverage every day to get the content I publish to drive more traffic, reach more users, and pull more prospects into the sales funnel. You can use them, too.

1. Freshen Up Your Content

We see every day that Google gives more weight to fresher content. Content that gets updated tends to perform better in the organic SERPs than dormant content. If you look at it from a user perspective, older information is often seen as outdated, potentially less accurate or irrelevant, and can be perceived as having less value.

I recommend routinely updating any content on your site that:

  • Drives conversions. If you have goal-tracking set up in Google Analytics (GA), look at Behavior>Site Content>Landing Pages, filter by Organic Traffic and check page-level goal completions.
  • Drives qualified visits. This refers to any content that’s core to your business and/or fulfills KPIs you’re measuring, in addition to conversions, such as engagement metrics or even raw traffic numbers.
  • Has lost traffic. In GA, use the “compare to” feature. Pick a date range from the previous year to compare to this year’s traffic, and locate pages that have lost traffic over time.

Here are some tips for freshening up this content:

  • Add an “updated” field. Most blog posts have a published date. Change that field to “last updated” to keep it looking timely. Or, you can physically add in a “last updated” line of text in the body copy somewhere on the page.
  • Omit dates in your file paths. If you’re starting a new blog or reconfiguring your site, omit dates from your URL file path schemes. Don’t intentionally date your content if you don’t have to.
  • Reuse annual posts. If you publish annual resource posts, like the “Most Popular [X Product] of 2012,” don’t start from scratch with a new page each year. Keep the same file/URL and rework the content instead. Leverage the trust and authority that page has already accrued, and don’t put dates in the file name. Date the post title and title tag instead. Both can be updated.

If you’re skeptical about the impact of freshness, follow the above recommendations, update 10 posts on your site that have lost traffic recently, and see what happens.

2. Promote Other People’s Work

A great way to amplify the reach of your content is to promote the work and wisdom of others. Including the thoughts or writings – or even the products – of others in your content adds a layer of “built-in distribution.” Anyone featured in your content is incentivized to help share it, because by promoting your content, they're promoting themselves, too.

You can feature others in your content by:

  • Including a quote. This can be a quote you get directly from the source or one you pull from something they’ve already written, even if it’s a simple Tweet. Just make sure to cite the source.
  • Running a solo or group interview. I've recently contributed to group interviews here, here and here. I've shared those articles on social and now I'm linking back to them. See how that works?
  • Linking out. If there are key industry thought leaders, influencers or bloggers you’re targeting, regularly include links to anything they’ve published, be it an article, an eBook, or their blog.

Make sure you let the person, website or company know that you’ve featured them on your site. Shoot them an email or a Tweet to get on their radar. Compliment them, but don’t explicitly beg for anything in return. Let the reciprocation happen naturally.

You’d be surprised how effective this technique is for increasing the reach of your content. What’s more, if you’re active in nurturing these relationships, it can lead to even more growth opportunities.

3. Buy Some Traffic

One surefire way to get more traffic to your content is to set up a paid campaign on StumbleUpon. You can also buy paid placement on Reddit, Facebook, Twitter and LinkedIn (more info here), but I find StumbleUpon to deliver the most predictable results, and be the easiest to work with.

StumbleUpon can generate instant exposure for your content at scale, and quickly: you can launch a new campaign – which has a range of audience targeting options – in 30 seconds. You can plug in your URL and go, without creating and testing ad copy, and “engaged visitors” only cost 15 cents.

The flip side to buying traffic is that you're not “earning” it by actively building relationships and a community. So the traffic is generally of lower value, and engagement metrics aren't ideal: less time spent on the site, higher bounce rates, and fewer deep visits. What's more, once the budget runs out, the traffic does, too.

However, those critiques don’t negate the power of using paid discovery as a tool to amplify exposure and distribution opportunities for your content. This is particularly true for new sites or publications that haven’t built an audience yet.

4. Go Local

Including a regional or local angle in your content lets you harness the awesome power of parochialism. People enthusiastically share and distribute content that touts their city, town or state as “tops” or “best” at something, even if the source is dubious.

These lists serve as a point of pride for residents and reinforce their decision to live in a particular area. They’re effective click bait because they pique our curiosity. What parent doesn’t want to know if their child’s school made the list of “top public schools” in their state?

Examples of content with a regional angle include the most Googled brands in each state and a Business Insider feature on New England slang. When it comes to promotion, target local journalists and radio station websites. They frequently feature this type of content, so it’s a perfect angle to pitch when doing outreach. You can buy ads on Facebook and leverage geo-targeting, as well.

Localism works even for the most obscure niches. No matter how boring your industry, you can still find ways to work in a regional angle to appeal to a local audience.

5. Recirculate Older Content

Most content has a relatively short lifecycle. It gets published and promoted, and then disappears into the dark archival abyss as new content gets rolled out. It’s rare for older content to get recirculated and re-shared, which is a major missed opportunity for getting maximum value out of your content assets.

If content is successful, you should continue to share it even when it's weeks or months old, so you can:

  • Reach new audiences. The average lifespan of a tweeted link is 18 minutes, so only a small segment of your followers have seen it. And it's very unlikely any new followers have seen it, either. If you're worried that you'll annoy your audience, don't be.
  • Get more traffic. Recirculating content on your social channels, plus the re-shares it earns from your audience, helps drive new visits to older content pieces.
  • Boost ranking signals. Page-level signals like links, shares and brand mentions boost organic rankings, either directly or indirectly. If your older content is collecting dust, there's zero chance of sending more signals to these key pages. Re-sharing them renews this opportunity.

When choosing content to recirculate, I recommend targeting anything that's already been successful: articles with high share counts; high engagement metrics, such as pages per visit and time on page; and lots of comments. Finally, if you're like me and you aim to scale your efforts, put your social recirculation on autopilot.

6. Publish on Popular Topics

One sure-fire way to drive more qualified traffic to your site is to target popular topics. Audiences have already demonstrated that they want and value this type of information.

To find popular topics, you can leverage keyword tools (here’s a great list); look at competitor content that’s already generated a high degree of engagement; pull an internal site search report in Google Analytics and audit your existing content to see what people are searching for and engaging with. There’s also a range of sources I detailed in a previous column.

7. Amplify Your Referred Visits

Referred traffic, or traffic sent to your site from other domains, is one of the most overlooked segments of traffic. In many cases, referral traffic can comprise a sizeable percentage of your visits, as well as generate leads and conversions.

Wouldn't it be great if you could increase the amount of traffic these referrers send to your site? You can. Here's how.

  1. Start by gathering a list of top referrers to your site from your analytics report. Make sure to drill down and determine the specific URL/page on that site that is sending the traffic.
  2. Drop those URLs into a tool like SEMrush and discover which keywords that page ranks for, and which of those terms yield the most traffic, to help inform your anchor text strategy.
  3. Build links to that page to help it rank higher for the traffic-generating terms you pulled in the previous step.
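Step 1 above is easy to script once you have an export of referral data. Here's a small sketch; the CSV layout (`referrer_url` and `sessions` columns) is an assumption, but most analytics tools can export something equivalent.

```python
# Sketch: rank referring pages by total sessions from an analytics CSV export.
# The column names (referrer_url, sessions) are assumptions.
import csv
from collections import defaultdict

def top_referrer_pages(csv_path, limit=10):
    """Sum sessions per referring page and return the top pages, highest first."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["referrer_url"]] += int(row["sessions"])
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:limit]
```

The resulting top pages are the ones worth feeding into a keyword tool in step 2.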

When building these links, be cautious. If you're too aggressive, you risk getting the page dinged by Google's Penguin algorithm. Not only would this kill the stream of referred traffic to your site, it would also hurt the domain of the site you're targeting. Resist the urge to fire hundreds of commercial anchor links at the page using an automated backlink generation tool.

Instead, try the following approaches:

  • Add links in contributed content. If you write for other sites, try to work in a link (naturally) to the page. If you don’t have contributor gigs, then line some up.
  • Add links from your own site. You can also link to pages on the referrer's site from your own sites. Again, link naturally, and use caution with anchors: branded anchors, mixed commercial anchors, naked URLs, or generic words such as “here” are the safest plays.
  • Promote on social. Actively share high-value referrer pages across social media. Links from Twitter are nofollowed and don't pass link equity, but exposing this content to your audience increases the chances it will get linked to.

By building links, you can improve the visibility for that referrer’s page in the organic SERPs. This leads to more traffic to that page, which, in turn, increases the potential for more referred visits to your site.

Panda and Penguin Are Not Penalties
https://searchenginewatch.com/2015/07/15/panda-and-penguin-are-not-penalties/
Wed, 15 Jul 2015 10:30:00 +0000

Did you know that Penguin and Panda are NOT penalties? It seems hard to believe that this is true, especially if your website is experiencing horrendous ranking drops because it has been affected by one of these changes. However, Google is quite adamant that we should not be calling these algorithmic changes penalties.

In John Mueller’s Google Webmaster Help Hangouts, any time someone mentions a ‘Penguin penalty’ or ‘Panda penalty’, he is quick to say that these algorithms are NOT penalties. “From our point of view, Penguin isn’t a penalty. It’s essentially a search quality algorithm… a penalty is something that is done manually”, he said.

When asked a question about recovering from a ‘Penguin penalty’, John’s answer was: “We see Penguin as an algorithm. It’s not something we’d see as a penalty… It’s not something that’s either on or off. It’s something where we look at the signals that we have and we try to find the right way to adjust for that.”

What is a Google penalty?

Google penalties do indeed exist. Google can manually penalize a site for things such as inbound unnatural links, outbound unnatural links, thin content, pure spam and more. The most common reason for a site to be manually penalized is because someone has reported the site to Google. Most people also believe that Google manually reviews the top sites in several competitive search results as well, but we don’t know this for sure.

If your site has a manual penalty, you will see evidence of this in your Google Search Console (formerly called ‘Webmaster Tools’). To see if you have a manual penalty, go to Search Traffic → Manual Actions. You’ll either see a penalty like this:

[Image: sitewide manual action notice in Search Console]

Or, if you have no penalty you will see this:

[Image: “No manual webspam actions found” notice]

Important Note: You will not always see evidence of a penalty in the ‘messages’ section of Google Search Console. If you were added to Search Console for this site before the site was penalized, then you should see a message that looks something like this:

[Image: unnatural links penalty message]

However, if you became an owner or restricted owner after the penalty message was initially sent, there will be no penalty message for you to see in the messages section. In this case, though, you will still be able to see the penalty in the Manual Actions viewer. Hopefully this is something that Google will change in the future; it would be quite helpful for a newly verified owner to be able to see past site messages.

If you have a manual penalty, once you have cleaned up your site you can file for reconsideration. If you have done a thorough job, a Google employee will manually remove the penalty. One thing that changed in 2013 is that only sites with a manual action can apply for reconsideration. Prior to this, anyone could file a reconsideration request, even with no manual penalty; sites that were only affected algorithmically would simply get an automated response saying there was no manual penalty.

What is an Algorithmic Filter?

Google’s algorithms are immensely complex. There are parts of the algorithm that are constantly evaluating websites and modifying their rank depending on what they see. For example, Google’s keyword stuffing algorithm re-evaluates your site each time that Google crawls it. There are other parts of the algorithm, however, that we call filters. Filters are modifications that only take effect when Google decides to run them. Penguin and Panda are filters.

If either of these algorithms determines that your website is not a high quality site (or does not have high quality backlinks, in the case of Penguin), then Google will adjust the algorithm so that your site does not rank as well. If you have lots of issues, you can be affected severely. If you have just a few issues, you may see just a minor rank deduction.

I look at Penguin and Panda as if they were like sandbags that are holding a hot air balloon down. A site with serious issues can have very heavy sandbags that pull the site down and make it almost impossible for the site to rise in rankings unless those sandbags are removed. A site with minor issues might have lighter sandbags applied. These smaller weights still pull the site down somewhat, but not as severely.

Here are some things that separate manual penalties from algorithmic filters:

  • A manual penalty is manually applied by a member of Google’s webspam team. An algorithmic filter is applied automatically.
  • You can’t file for reconsideration to get an algorithmic filter removed.
  • With a manual penalty, once you’ve cleaned up, and successfully requested reconsideration, the penalty is lifted. With an algorithmic filter, you need to improve your site and then wait for the algorithm (Penguin or Panda) to either update or refresh and reassess your site.
  • A manual penalty is either on or off. There can be cases where a severe penalty is downgraded to a less severe one, such as a sitewide unnatural links penalty being downgraded to a partial match, but in general a manual penalty is either there or it’s not. With an algorithmic filter, however, you can be affected to different degrees. Not all algorithmic hits are drastic.
  • There is no way of telling whether you are being demoted by an algorithmic filter. Google employees have a console where they can see whether a site is being affected by Panda or Penguin, but webmasters can’t see this. Oh how I wish Google would allow us to see whether we are dealing with an algorithmic filter! If you see a drop in traffic that coincides with the date of a known or suspected refresh or update of Panda or Penguin, that is a good hint that you are dealing with one of these issues. However, not all refreshes are announced. And, in the future, Google plans to incorporate both of these algorithms into the main algorithm, so it is going to be harder to determine what needs to be done in order to see recovery.
  • With an algorithmic filter, there is no way of knowing whether you’ve done enough work to escape the filter once it re-runs. If you clean up your backlinks, Penguin refreshes and you see a mild improvement, there’s no way of knowing whether you would have seen more improvement if you had removed or disavowed more links. You can’t tell whether you still have a mild case of Penguin or whether the site is completely free of algorithmic sandbags holding it down. Similarly, Google doesn’t tell you what type of on-site quality issues they want to see cleaned up for Panda. We take our best guess when doing a Panda cleanup, but if Google is taking issue with something that we haven’t addressed and is still suppressing the rankings for that site, there’s no way to know.

Should We Be Calling Panda and Penguin Penalties?

Do a Google search for “Panda penalty” or “Penguin penalty” and you’ll see some well known SEO professionals using this terminology. Is it wrong to do so? In my mind it’s all semantics. If you want to sound like someone who really understands Google’s algorithms, it’s probably best to refer to Panda and Penguin as algorithmic filters rather than penalties. But, when I’m talking to a small business owner who has had their revenue severely cut because they’re stuck under an algorithmic filter, I certainly don’t correct them when they say they are being penalized.

I feel that Google has done a good job at cleaning up the search results for the most part. When someone searches for information on car insurance, they’re not likely to see some scuzzy buy-cheap-car-insurance-online-now.biz site that got to the top of Google by manipulating the PageRank flowing to the site. As a user, I generally am getting better results now than I did a few years ago. It’s good for Google to show the most relevant results possible. But, these filters are causing so many businesses to suffer severely. Some made poor decisions in hiring a low quality SEO to build links to their site. Others don’t even know what they did wrong, but are the victim of site quality issues that perhaps are caused by a faulty CMS. In my opinion, there needs to be a better way for sites like this to be able to recover.

What Do You Think?

Have you been negatively affected by Panda or Penguin? Do you think we should be calling these penalties?

Simple SEO Mistakes That Can Cause Damage https://searchenginewatch.com/2015/07/07/simple-seo-mistakes-that-can-cause-damage/ https://searchenginewatch.com/2015/07/07/simple-seo-mistakes-that-can-cause-damage/#respond Tue, 07 Jul 2015 10:30:00 +0000 https://www.searchenginewatch.com/2015/07/07/simple-seo-mistakes-that-can-cause-damage/ For the average website owner, getting into trouble with Google’s organic search algorithm can happen accidentally. Between optimization, architecture, getting links and structuring data, it’s easy to make a misstep or three. Simple mistakes can unwittingly put sites at risk for ranking loss, manual penalties, and algorithmic filters.

With changes that consistently tighten Google’s “quality” belt, there are inevitably winners and losers. The slap-down nature of algorithm updates is intended to improve the quality of search results, but it has also created a landscape where it’s entirely possible to mess up, even while trying to do things right.

Structured Markup

Structured data penalties, reinvigorated after Google’s recent Quality Update, offer a newer opportunity to enter the danger zone – cue the Kenny Loggins.

Targeting marked-up pages that include invisible, misleading or irrelevant content makes sense on Google’s part, but the notion of relevance does create a gray area of subjectivity. Moreover, the guidelines’ specificity about ratings and reviews belonging on individual products, rather than lists or categories, is such a precise detail that it would be easy to inadvertently implement markup on disqualified pages.

While understanding the nuance of the guidelines is the first line of protection, a larger reflection on the purpose of structured markup provides an even clearer directive. Principally, consider how it adds to the user experience in a way that is more significant than the visual effect or SEO value.

Certain markups should be about making it easier to navigate directly from search to a place or a product on the site. But that is only valuable when it is exactly the information a user intended to get. Reviews and ratings applied to categories do not provide the granular level of feedback a searcher may be expecting. In any situation, if the markup only makes it easier for a user to get to the wrong information, there is a problem.
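To make the product-versus-category distinction concrete, here is a minimal sketch of review markup using schema.org’s Product and AggregateRating vocabulary, serialized as JSON-LD. The product name, rating and review count are hypothetical placeholders:

```python
import json

# JSON-LD review markup (schema.org vocabulary) for a single product page.
# Per Google's guidelines, this belongs on the page for the specific product
# being reviewed -- not on a category or list page.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Anvil",  # hypothetical product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",   # average score shown on the page itself
        "reviewCount": "89",    # count of reviews visible on this page
    },
}

# Embed the output in the page inside a
# <script type="application/ld+json"> ... </script> tag.
print(json.dumps(product_markup, indent=2))
```

The key design point is that the rating data in the markup mirrors what a visitor can actually see on that one page; markup describing content that is invisible or aggregated from elsewhere is exactly what the guidelines target.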

Individual Page Value

There are many ways page creation and optimization can go wrong, putting a site at risk for Panda problems. New URLs created by search parameters, internal search results, and quick views can get indexed, inflating a site’s index with inauthentic, low-value pages. New pages added to provide destinations for searchers can be perceived as low quality if they lack distinct user value.

Legitimate features like coupon codes, maps, listings and definitions can be considered thin because of the succinct nature of the information they present. These innocuous and useful resources can be perceived as insubstantial, particularly when they make up a considerable portion of the site’s entire composition. In all these cases, the offending behavior may be the result of small cracks in an SEO foundation, rather than a willful attempt at manipulation.
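One common source of these low-value URLs (internal search results and quick-view parameters) can be kept away from crawlers entirely. A minimal robots.txt sketch; the paths and parameter names are hypothetical and should be matched to your own URL patterns:

```text
# robots.txt -- keep crawlers out of internal search results and quick views
# (paths and parameters below are hypothetical examples)
User-agent: *
Disallow: /search
Disallow: /*?quickview=
```

Note that robots.txt only blocks crawling; for parameter URLs that are already indexed, a `<meta name="robots" content="noindex">` tag on the pages themselves is the more reliable way to get them removed.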

Link Building

Link building has become a minefield. Old, misbegotten links can fester. An overabundance of keyword anchor text, purposeful or not, can incur wrath. Directory placement, certain guest blogs, syndication and a number of once-popular – and unfortunately, still available – tactics can cause a site’s rankings to plummet. Then comes the difficult process of a reconsideration request in manual cases or the painful limbo of waiting between Penguin rollouts.

While these refreshes may eventually be integrated into the standard algorithm, for now they remain few and far between. Getting in bed with the wrong link provider, or even failing to monitor your link profile closely, can result in bad links that pop up like weeds in a garden. And just like weeds, if enough of them are allowed to invade, they can choke out the growth you’ve carefully cultivated.
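For bad links that can’t be removed at the source, Google’s disavow file (uploaded through Search Console) is the remaining lever, whether you’re preparing a reconsideration request or waiting out a Penguin refresh. The format is one entry per line, with `#` for comments and a `domain:` prefix to disavow an entire domain; the domains below are hypothetical:

```text
# Removal of these links was requested on 7/1/2015,
# with no response from the webmasters.
domain:spammy-directory.example
domain:article-farm.example
https://blog.example/single-bad-page.html
```

Disavowing a whole domain is usually safer than listing individual URLs, since link networks tend to spawn many URLs from the same host.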

Scrutiny is Safety

Even if current iterations of Panda and Penguin are causing less widespread devastation than in the past, continuing data refreshes can still hurt. Minor oversights in the areas of links, markup or content errors can become ticking time bombs.

If the core of your SEO strategy is users rather than search engines – creating quality content, building relationships and leveraging multiple channels for brand visibility – you’re already on the right course.

But if good intentions can pave the way to hell, even the best SEO intentions can take a wrong turn. Having people on your team who monitor the evolution of search engine changes and can apply that insight to all areas of planning and implementation is crucial. When it comes to search, the big-picture perspective and an understanding of the granular details are equally important for creating a safe and thriving strategy.
