Google Search Console: a complete overview (9 May 2016)

The Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensably useful service offered by Google to all webmasters.

Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search.

Search Console Dashboard

Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here.

Perhaps most importantly though, Search Console is where Google will communicate with you should anything go wrong (crawling errors, manual penalties, increase in 404 pages, malware detected, etc.)

If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, more expensive tools that essentially do the same thing.

To get started, all you need is a Google sign-in, which you probably already have if you regularly use Google or Gmail; then visit Search Console.

Then follow this complete guide which will take you through every tool and feature, as clearly and concisely as possible.

Please note: we published a guide to the old Webmaster Tools service, written by Simon Heseltine, back in 2014. This is an updated, rewritten version that reflects the changes and updates to Search Console since, but much of the credit should go to Simon for laying the original groundwork.


Add a property

If you haven’t already, you will have to add your website to Search Console.

Just click on the big red Add a Property button, then add your URL to the pop-up box.

add property in search console

Verification

Before Search Console can access your site, you have to prove to Google that you’re an authorized webmaster. You don’t have to be in charge, but you do need permission from whoever is.

There are five methods of verification for Search Console. There’s no real preference as to which method you use, although Google does give prominence to its ‘recommended method’…

1) The HTML file upload: Google provides you with an HTML verification file that you need to upload to the root directory of your site. Once you’ve done that, you just click on the provided URL, hit the verify button and you’ll have full access to Search Console data for the site.

verify your site in search console

There are also four alternative methods if the above doesn’t suit…

alternate methods of uploading to Search Console

2) HTML tag: this provides you with a meta tag that needs to be inserted in the <head> section of your homepage, before the first <body> section.

If you make any further updates to the HTML of your homepage, make sure the tag is still in place, otherwise your verification will be revoked. If this does happen, you’ll just have to go through the process again.
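The tag itself looks something like this (a minimal sketch: the content value is a placeholder, as Google generates a unique token for each property):

    <meta name="google-site-verification" content="your-unique-token" />

Once Google recrawls your homepage and finds the tag in the <head>, verification completes.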

3) Domain Name Provider: here you’re presented with a drop-down list of domain registrars or name providers, then Google will give you a step-by-step guide for inserting a TXT record into your DNS configuration.

4) Google Analytics: assuming you’re using Google Analytics and your Google account is the same one you’re using for Search Console, then you can verify the site this way, as long as the GA code is in the <head> section of your home page (and remains there), and you have ‘edit’ permission.

5) Google Tag Manager: this option allows you to use your own Google Tag Manager account to verify your site, providing you’re using the ‘container snippet’ and you have ‘manage’ permission.

Now that you’re verified, you’ll be able to see your site on the Home screen, as well as any other sites you’re a webmaster for. Here you can access the site, add another property and see how many unread messages you’ve received from Google.

Search Console Home

If you click on your site, you will be taken to its own unique Dashboard.

For the purposes of the following walk-throughs, I’ll be using my own website Methods Unsound, which means you can see all the things I need to fix and optimise in my own project.

Dashboard

Here’s where you can access all of your site’s data, adjust your settings and see how many unread messages you have.

Search Console Dashboard

The left-hand Dashboard Menu is where you can navigate to all the reports and tools at your disposal.

The three visualisations presented on the Dashboard itself (Crawl Errors, Search Analytics, and Sitemaps) are quick glimpses at your general site health and crawlability. These act as short-cuts to reports found in the left-hand menu, so we’ll cover them as we walk through the tools.

Also note that Google may communicate a message directly on the dashboard, if it’s deemed important enough to be pulled out of your Messages. As you can see I have errors on my AMP pages that need fixing, but we’ll look at this when we get to the Dashboard Menu section further down.

First let’s take a look at settings…

Settings

Clicking on the gear icon in the top right corner will give you access to a variety of simple tools, preferences and admin features.

search console preferences

Search Console Preferences

This is simply where you can set your email preferences. Google promises not to spam you with incessant emails, so it’s best to opt in.

Search Console Preferences - email

Site Settings

Here’s where you can set your preferred domain and crawl rate.

site settings

  • Preferred domain lets you set which version of your site you’d like indexed and whether your site shows up in search results with the www prefix or without it. Links may point to your site using http://www.example.com or http://example.com, but choosing a preference here will set how the URL is displayed in search. Google states that: “If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages”, thus cannibalising your search visibility. It also helps to enforce your preference with a server-side 301 redirect, as in the sketch after this list.
  • Crawl rate lets you slow down the rate at which Googlebot crawls your site. You only need to do this if you’re having server issues and crawling is definitely responsible for slowing down the speed of your server. Google has pretty sophisticated algorithms to make sure your site isn’t hit by Googlebot too often, so this is a rare occurrence.
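If your preference is the www version, a minimal .htaccess sketch would be as follows (assuming an Apache server; the domain is a placeholder):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

This 301-redirects every non-www request to its www equivalent, so visitors and crawlers converge on a single version of each URL.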

Change of Address

This is where you tell Google if you’ve migrated your entire site to a new domain.

Search Console Change of Address

Once your new site is live and you’ve permanently 301 redirected the content from your old site to the new one, you can add the new site to Search Console (following the Add a Property instructions from earlier). You can then check the 301 redirects work properly, check all your verification methods are still intact on both old and new sites, then submit your change of address.

This will help Google index your new site more quickly than if you just left Googlebot to detect all your 301 redirects of its own accord.
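For reference, a minimal sketch of the redirect itself (assuming an Apache server, with placeholder domains) that would live in the old domain’s .htaccess:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]

Every URL on the old domain is permanently redirected to the equivalent path on the new one, which is exactly what the Change of Address tool expects to find.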

Google Analytics Property

If you want to see Search Console data in Google Analytics, you can use this tool to associate a site with your GA account and link it directly with your reports.

Search Console Google Analytics Property

If you don’t have Google Analytics, there’s a link at the bottom of the page to set up a new account.

Users & Property Owners

Here you can see all the authorized users of the Search Console account, and their level of access.

Search Console Users and Property Owners

You can add new users here and set their permission level.

  • Anyone listed as an Owner will have permission to access every report and tool in Search Console.
  • Full permission users can do everything except add users, link a GA account, and inform Google of a change of address.
  • Those with Restricted permission have the same restrictions as Full permission users, plus they only have limited viewing capabilities on data such as crawl errors and malware infections. They also cannot submit sitemaps, URLs, reconsideration requests or request URL removals.

Verification Details

This lets you see all the users of your Search Console account, their personal email addresses and how they were verified (including all unsuccessful attempts).

Search Console verification details

You can unverify individuals here (providing you’re the owner).

Associates

Another Google platform, such as a Google+ or AdWords account, can be associated (or connected) with your website through Search Console. If you allow this association request, it will grant the associate capabilities specific to the platform they are associating with you.

Here’s an example direct from Google: “Associating a mobile app with a website tells Google Search to show search result links that point to the app rather than the website when appropriate.”

If you add an associate, they won’t be able to see any data in Search Console, but they can do things like publish apps or extensions to the Chrome Web Store on behalf of your site.

Search Console associates

Dashboard Menu

Here’s where you’ll find all of the reports and tools available in the Search Console.

Search Console Dashboard menu

Let’s look at each option one-by-one.

Messages

Here’s where Google communicates with webmasters.

Search Console All Messages

Again, you won’t get spammed here, as Google promises not to bombard you with more than a couple of messages a month. You do need to pay attention when you receive one, though, as this is where you’ll be informed if your site’s health is compromised.

This can be anything from a rise in 404 pages, to issues with crawling your site, or even more serious problems like your site being infected with malware.

Search Appearance

If you click on the ? icon to the right of ‘Search Appearance’, a handy pop-up will appear: the Search Appearance Overview breaks down and explains each element of the search engine results page (SERP).

Search appearance Dashboard

By clicking on each individual element, an extra box of information will appear telling you how to optimise that element to influence click-through, and where to find extra optimisation guidance within Search Console.

Search Console Dashboard explainer

Structured Data

Structured data is a way for a webmaster to add information to their site that informs Google about the context of any given webpage and how it should appear in search results.

For example, you can add star ratings, calorie counts, images or customer ratings to your webpage’s structured data and these may appear in the snippets of search results.

captain america civil war review rich snippet
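To illustrate, here is a minimal JSON-LD sketch of the kind of review markup that can produce a snippet like the one above (the names and values are purely illustrative placeholders):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Review",
      "itemReviewed": {
        "@type": "Movie",
        "name": "Captain America: Civil War"
      },
      "author": { "@type": "Person", "name": "Reviewer Name" },
      "reviewRating": {
        "@type": "Rating",
        "ratingValue": "4",
        "bestRating": "5"
      }
    }
    </script>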

The Structured Data section in Search Console contains information about all the structured data elements Google has located on your site, whether from Schema markup or other microformats.

structured data in search console

It will also show you any errors it has found while crawling your structured data. If you click on the individual ‘Data Types’ it will show you exactly which URLs contain that particular markup and when it was detected.

If you click one of the URLs listed, you can see a further breakdown of the data, as well as a tool to show exactly how it looks in live search results. Just click on ‘Test Live Data’ and it will fetch and validate the URL using Google’s Structured Data Testing Tool.

Search Console Structured Data test

Data Highlighter

Data Highlighter is an alternative to adding structured data to your HTML. It’s a point-and-click tool where you can load any webpage, then highlight various elements to tell Google how you want that page to appear in search results.

There’s no need to implement any code on the website itself and you can set the Data Highlighter so it tags similar pages for you automatically.

To begin, click on the big red ‘Start Highlighting’ button…

Search Console Data Highlighter

Then enter the URL you wish to mark up…

Search Console Data Highlighter upload

Then start highlighting and tagging…

structured data highlighter

After you hit publish, Google will take your added structured data into account once it has recrawled your site. You can also remove any structured data by clicking ‘Unpublish’ on the same page if you change your mind.

HTML Improvements

This is where Search Console will recommend any improvements to your meta descriptions and title tags, as well as informing you of any non-indexable content.

Search Console HTML Improvements

This is a very handy, easy-to-use feature that gives you optimisation recommendations that you can action right away.

For instance, if I click on the ‘Short meta descriptions’ link, I’ll be able to see the 14 URLs and their respective meta descriptions. I can then go into each one of these pages in my own CMS and add lengthier, more pertinent text.

Search Console HTML Improvements meta descriptions

Title tags and meta descriptions should be unique for each page and fall within certain character lengths, so for the purposes of both user experience and keeping Google informed about your site, this is a worthwhile report.
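As a rough sketch (the text is a placeholder, and the lengths are commonly cited guidance rather than hard limits: roughly 50–60 characters for titles and 150–160 for descriptions):

    <title>Page Topic | Site Name</title>
    <meta name="description" content="A unique, accurate summary of this page, written to encourage the click-through.">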

Sitelinks

Sitelinks are the subcategories that appear under the main URL when you search for a brand or a publisher.

sitelinks example

Sadly you can’t specify to Google which categories you want highlighted here, but if you’re popular enough and your site’s architecture is solid enough then these will occur organically.

However in the Sitelinks section of Search Console, you can tell Google to remove a webpage that you DON’T wish to be included as a sitelink in your search results.

Search Console Sitelinks

Accelerated Mobile Pages

This is a brand new tool, as Google’s AMP programme has only been available since earlier this year. AMP is a way for webmasters to serve fast-loading, stripped down webpages specifically to mobile users. Site speed and mobile friendliness are considered ranking signals so this is an important feature, although some SEOs are slow to adopt it.

As you can see from the report below, we’ve just started introducing AMP to our webpages and making a bit of a hash of it…

Search Console Accelerated Mobile Pages report

Accelerated Mobile Pages lets you see all the pages on your site with AMP implemented, and which ones have errors. If you click on an error, you can see a list of the affected URLs. Click on a URL and Google will recommend a fix.

Search Console Accelerated Mobile Pages fix

Clearly we have some custom JavaScript issues on our site that need addressing. If you click on the ‘Open Page’ button, you can see exactly how your AMP content appears on mobile.

Search Traffic

Search Analytics

Search Analytics tells you how much traffic you get from search, revealing clicks and impressions delivered on SERPs. It will also work out your click-through rate (CTR) and reveal your average organic position for each page.

And here’s the *really* good stuff… you can also see the queries that searchers are using in order to be served your site’s content.

Search Console Search Analytics

The data for this is collected differently from Google Analytics, so don’t expect the numbers to tally. What this feature is really useful for is seeing which keywords and phrases are driving traffic to your site, as well as which individual pages generate that traffic.

You can toggle between a variety of options, filters and date-ranges. I highly recommend looking at Impressions and CTR, to see which pages are generating high visibility but low click-through rate. Perhaps all these pages need is a tweak of a meta-description or some structured data?

Links to Your Site

Here’s where you can see the domains that link to your site and its content the most, as well as your most linked webpages.

Search Console Links to Your Site

This isn’t an exhaustive list, but a good indicator of where your content is appreciated enough to be linked. Clicking on the URLs on the right-hand side will show where each one is being linked to individually.

Internal Links

Here is where you can see how often each page on your site has been internally linked. Clicking on each ‘Target page’ will show a list of URLs where the internal link occurs.

Search Console Internal Links

There is a limit to how many ‘Target pages’ Search Console will show you, but if you have a small number of pages you can reverse the sort order and see which target pages have zero internal links. You can then go into your site and give these pages an internal link, or redirect them to somewhere else if they’re old legacy pages.

Manual Actions

This is where Google will inform you if it has administered a manual action to your site or specific webpage.

GWT Manual Actions

Google will offer any recommendations for you to act upon here, and will give you the chance to resubmit your site for reconsideration after you’ve fixed any problems.

Here’s a guide to what Google will most likely give you a manual penalty for and how you can avoid it.

International Targeting

Here you can target an audience based on language and country.

Search Console International Targeting

  • Country: If you have a neutral top-level domain (.com or .org), geotargeting helps Google determine how your site appears in search results, particularly for geographic queries. Just pick your chosen country from the drop-down menu. If you don’t want your site associated with any country, select ‘Unlisted’.
  • Language: If you manage a website for users speaking a different language, you need to make sure that search results display the correct version of your pages. To do this, insert hreflang tags in your site’s HTML, as shown below; this is what Google uses to match a user’s language preference to the right version of your pages. Alternatively, you can use sitemaps to submit language and regional alternatives for your pages.
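A minimal sketch of hreflang annotations for a site with UK and US English versions (the URLs are placeholders); each set of tags goes in the <head> of every alternate version:

    <link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
    <link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
    <link rel="alternate" hreflang="x-default" href="http://example.com/" />

The x-default line tells Google which version to serve users who match none of the listed languages.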

Mobile usability

As mobile has overtaken desktop for searches this year, obviously your site has to be mobile-friendly, otherwise you’re providing a poor user experience to potentially half your visitors.

This report tells you about any issues your site has with mobile usability. You’ll really want to be seeing the following message, as Google explicitly states that pages with mobile usability problems will otherwise be demoted in mobile search results.

Search Console Mobile Usability

Possible errors that will be highlighted by Search Console here include:

  • Flash usage: mobile browsers do not render Flash-based content, so don’t use it.
  • Viewport not configured: visitors to your site use a variety of devices with differing screen sizes, so your pages should specify a viewport using the meta viewport tag (see the example after this list).
  • Fixed-width viewport: viewports fixed to a pixel-size width will flag up errors. Responsive design should help solve this.
  • Content not sized to viewport: if a user has to scroll horizontally to see words and images, this will come up as an error.
  • Small font size: if your font size is too small to be legible and requires mobile users to ‘pinch to zoom’ this will need to be changed.
  • Touch elements too close: tappable buttons that are too close together can be a nightmare for mobile visitors trying to navigate your site.
  • Interstitial usage: Google will penalise you if you’re using a full-screen interstitial pop-up to advertise an app when a user visits your mobile site.
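For the two viewport errors above, the standard responsive declaration is a single line in the page’s <head>:

    <meta name="viewport" content="width=device-width, initial-scale=1">

This sizes the page to the device’s width rather than a fixed pixel width, which resolves both the missing-viewport and fixed-width errors when combined with responsive CSS.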

Google Index

Index Status

This lets you know how many pages of your website are currently included in Google’s index.

Search Console Index Status

You can quickly see any worrying trends from the last year (for instance that little dip in May 2015), as well as any pages that have been blocked by robots or removed.

Content Keywords

Here you can see the most common keywords found by the Googlebots as they last crawled your site.

Search Console Content Keywords

If you click on each keyword, you’ll be able to see the other synonyms found for that keyword, as well as the number of occurrences.

As Simon Heseltine suggests, look out for unexpected, unrelated keywords showing up, as this can be an indication your site has been hacked and hidden keywords have been injected into your pages.

Blocked resources

This section lets you know of any images, CSS, JavaScript or other resources on your site that are blocked to Googlebots.

Search Console Blocked Resources

These are listed by hostname, then by specific page, with steps you can follow to diagnose and resolve each issue.

Remove URLs

This is essentially where you can make your content disappear from Google.

remove urls search console

This only acts as a temporary fix, but by the time you’ve done this and either deleted your offending webpage or 301 redirected it elsewhere, there theoretically should no longer be a record of it.

Just enter the URL then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed.

Be warned: this request can take between two and 12 hours to be processed.

Crawl

Crawl Errors

This report shows all the errors that Google has found when crawling your site over the last 90 days.

Search Console Crawl Errors

Site errors: the top half of the screen shows three tabs; click on each to see any past problems with your DNS, your server connectivity, or whether a crawl had to be postponed. (Google will postpone a crawl rather than risk crawling URLs you don’t want indexed).

URL errors: the bottom half of the screen shows URL errors for desktop, smartphone and feature phone (a phone that can access the internet, but doesn’t have the advanced features of a smartphone).

You’ll likely see reports for the following on all three device types:

  • Server error: Google can’t access your site because the server is too slow to respond, or because your site is blocking Google.
  • Soft 404: this occurs when your server returns a real page for a URL that doesn’t actually exist on your site. You should configure these URLs to return a 404 (Not Found) or 410 (Gone) status code instead.
  • Not found: these are all your 404 pages that occur when a Googlebot attempts to visit a page that doesn’t exist (because you deleted it or renamed it without redirecting the old URL, etc.) Generally 404 pages are fine and won’t harm your rankings, so only pay attention to the ones related to high-ranking content.

Crawl Stats

This section shows the progress of Googlebots crawling your site in the last 90 days.

Search Console Crawl Stats

You can see how fast your pages are being crawled, kilobytes downloaded per day and average time spent downloading pages on your site.

Spikes are perfectly normal, and there’s not very much you can do about them. But if you see a sustained drop in any of these charts then it might be worth investigating to see what’s dragging it down.

Fetch as Google

Here you can check how any page on your website is seen by Google once it’s been crawled.

You can also submit these webpages for indexing. You may find this is a quicker way to be crawled and indexed than if you were to let Google find the page automatically.

Search Console Fetch as Google

  • When you ‘Fetch’ a page, Google will simulate a crawl and you can quickly check any network connectivity problems or security issues with your site.
  • ‘Fetch and Render’ does the same as the above, but it also lets you check how the page itself looks on mobile or desktop, including all resources on the page (such as images and scripts) and will let you know if any of these are blocked to Googlebots.

Remember the crawler is meant to see the same page as the visitor would, so this is a good way to get a direct on-page comparison.

If the page is successfully fetched and rendered, you can submit it to the index. You are allowed 500 webpage fetches per week, but you can only submit a webpage and have Google crawl all of the pages linked within it 10 times per month.

robots.txt Editor

A robots.txt file, placed in the root of your site, is where you can specify pages you don’t want crawled by search engines. Typically this is used when you don’t want your server overwhelmed by Googlebots, particularly if you want them to ignore script or style files, or if you want certain images not to appear in Google Image Search.
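A minimal robots.txt sketch covering those two cases (the paths are placeholders to adapt to your own site structure):

    User-agent: *
    Disallow: /scripts/

    User-agent: Googlebot-Image
    Disallow: /private-images/

The first block asks all crawlers to skip the /scripts/ directory; the second asks Google’s image crawler specifically to ignore images under /private-images/.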

Here is where you can edit your robots.txt and check for errors. The bottom of the page reveals your errors and warnings.

robots.txt editor search console

Sitemaps

Sitemaps are hosted on your website’s server and they basically inform search engines of every page of your site, including any new ones added. They’re a good way to help Google better crawl and understand your website.
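A sitemap is just an XML file following the sitemaps.org protocol. A minimal single-URL sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/a-page/</loc>
        <lastmod>2016-05-01</lastmod>
      </url>
    </urlset>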

Here’s where you can access all of the information about any sitemaps either submitted manually or found by Search Console. The blue bar represents pages or images submitted, the red bar represents actual pages and images indexed.

sitemaps search console

You can test a sitemap by clicking the ‘Add/Test sitemap’ button, and if it’s valid you can then add it to Search Console.

URL Parameters

As Simon Heseltine has previously commented, this section isn’t used much anymore since the introduction of canonical tags.

However you should use URL Parameters if, for instance, you need to tell Google to distinguish between pages targeted to different countries. These preferences can encourage Google to crawl a preferred version of your URL or prevent Google from crawling duplicate content on your site.

URL parameters

Security Issues

Although any security issues will be communicated to you in the Messages section and on the Dashboard screen, here’s where you can check on problems in more detail.

Search Console Security Issues

There’s also plenty of accessible information here about how to fix your site if it’s been hacked or been infected with malware.

Other Resources

Here’s where you can access all the tools provided by Google outside of Search Console, including the Structured Data Testing Tool and Markup Helper, which we went into greater detail about in earlier sections.

Search Console Other Resources

Other helpful resources here are the Google My Business Center, which you can use to improve your business’s local search visibility, and the PageSpeed Insights tool, which will tell you exactly how well your site is performing on mobile and desktop in terms of loading time, and how to fix any issues.

SEO for E-Commerce Websites (2 September 2015)

When dealing with an e-commerce website, there are several things that you’ll want to pay particularly close attention to in terms of SEO.

Your WordPress website is not going to have the same challenges as your IBM Websphere website. With e-commerce, you are dealing with a litany of areas where things can go wrong. Hopefully this column will help you avoid some of the pitfalls that often come from trying to optimize for e-commerce.

Keep in mind that this is not an exhaustive list of everything to watch for in the SEO of an e-commerce website, but these elements just happen to represent some of the more common things that I’ve come across.

Thin or Duplicate Content

For many retailers, if a manufacturer provides standard copy for product descriptions, they’re likely to use it. The more deadly sin occurs when the retailer makes no effort to work on copy across other areas of the website – namely category pages, shareable blog content, video content, and so forth.

We are currently in the midst of a Panda update that we hear is “slowly” rolling out. This is the first update since last September. Imagine how it must feel to find yourself not doing as well as you could because of thin or duplicate content, then rewriting your content, only to have to potentially wait a year to regain some Google-love for your content pages. If you’re not sure whether you have an issue here, it’s probably prudent to address it.

The duplicate content piece gets a little more interesting. Are you duplicating content across multiple categories and pages on your own website? Are you using other domains to merchandise the product under a separate brand and with the same content? Are affiliates scraping your content? These are all things that you will need to consider.

There are many tools available for checking duplicate content. Since SEMrush is one of my favorite tools, I typically just end up using its Audit feature to review for concerns.

SEMrush site audit chart

If you feel there may be an issue with external duplicate copy, it doesn’t hurt to utilize a tool like Copyscape to see how many other retailers or manufacturers may be using the same product descriptions.

Schema

Sometimes it feels as if Google and Bing are dragging us to our wits’ end with hundreds of algorithmic “best practices” to adhere to. They actually reward us for some of these demands with a richer search result display, which is nice. This is seen in the use of the Schema markup format in your page source code. While the page display will not change for your users, search engines will enjoy digesting content in a code markup that is easy for them to understand. Your reward is the addition of product price and availability information in search results.

One word of caution: make sure that your product prices are better than those of the competition ranking alongside you with the same markup display. Schema.org provides more info for creating this markup, and there are tools to check whether your schema markup is throwing errors. If you feel you may need some help coding these formats, you can also get assistance with schema coding from Google.
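To make that concrete, a minimal JSON-LD sketch of product markup carrying price and availability (all names and values are illustrative placeholders):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example School Desk",
      "offers": {
        "@type": "Offer",
        "price": "149.99",
        "priceCurrency": "USD",
        "availability": "http://schema.org/InStock"
      }
    }
    </script>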

Use of Canonical Tags

Canonical tag usage is important for e-commerce, in particular when products have attributes such as various sizes, colors, quantities, and so on, and you want to keep the “SEO goodness” applied to one version of that page. It’s also important that you don’t accidentally create a script which automatically adds a self-referencing canonical tag to every individual page. Self-referencing canonicals are often used to prevent issues related to scraping, so that the “content love” always resides with the original author’s work.

I say this because I have seen instances of pagination on e-commerce websites where each page of results has a canonical to itself. Hence, you will end up with many versions of the same page indexed in Google. Rather than have these pages “battle it out” for a ranking, have them work more cooperatively to keep the SEO love associated with one version of the category or page.

Google highlights a preference for the “view all” option of your category pages to be used as the canonical for these, should they be available. This is also beneficial when pagination is used for easier shopability.

Pagination

Something that I honestly hadn’t paid much mind to, until very recently, was the use of “previous” and “next” (prev/next) in pagination. When you have five pages of “results” for a given category of a product within your website, and you are not utilizing the prev/next snippets, you are essentially not showing Google the depth of your product offerings.

For example, with 10 results on a given page and five pages of results, you actually have 50 products under a given category. Without properly coding the prev/next, you are showing Google that your category depth is only 10 products deep. When Google is trying to determine who to rank for a given query and everyone else seems to have so much more product available under a given category, you can bet who the loser is going to be.

Code Samples:

  • Base Category Page: just shows the rel="next" tag and canonical tag

    <link rel="canonical" href="http://www.your-domain.com/category/all" />
    <link rel="next" href="http://www.your-domain.com/category/?page=2" />

  • 2nd Page of Category Results: shows rel="prev", rel="next" and the canonical tag

    <link rel="canonical" href="http://www.your-domain.com/category/all" />
    <link rel="prev" href="http://www.your-domain.com/category/" />
    <link rel="next" href="http://www.your-domain.com/category/?page=3" />

    (Notice the rel="prev" tag does not include the ‘?page=1’ URL, as that is equivalent to the base category URL, which typically does not natively include a ‘page=’ value.)

  • 3rd Page of Category Results: includes the complete set of all three ‘rel=’ tags

    <link rel="canonical" href="http://www.your-domain.com/category/all" />
    <link rel="prev" href="http://www.your-domain.com/category/?page=2" />
    <link rel="next" href="http://www.your-domain.com/category/?page=4" />

This pattern management is then continued for all subsequent pages.

Merchandising and Category Pages

Category pages are where e-commerce websites live – or die. People are most likely not searching specifically for the name of your product. They are searching generic phrases like “school desks,” “women’s red dress,” or “birthday gifts for him.” Often the same products are merchandised across several category pages.

Using keyword research, determine how your audience is searching to build out the pages and merchandise your products, so you have content that relates to how people search.

Do you sell products that are popular for their manufacturer ID? If you are a B2C marketer, many web users may search for a common product by its popular model version. For B2B marketers, a plant manager may look at a broken part’s manufacturer ID before heading to a keyboard. This can help to set you apart from the retailing competition.

URLs

Very commonly, the larger e-commerce platforms will bring along a clunky plan for how they generate URLs. Today search engines are much better at crawling and indexing, and platforms are much better at generating clean URL structures. While this is not nearly the issue that it once was years ago, it’s still something to be mindful of.

I’ve seen instances in which pagination ended up creating separate URLs that were each indexed. This resulted in duplicate content. Though not confirmed, it’s quite possible that the system also created the self-canonical tags, which I spoke of earlier.

Site-Search

Site-search is a big thing for anyone, but in particular for e-commerce. Understanding how people search for your products can not only help you with your paid search (PPC) efforts, it can also help you better understand the pages that you may want to build and how you may want to merchandise your products. SLI Systems, for example, uses learning search technology to take site-search behavior and help dynamically build pages for the long tail of search. But whether or not you utilize a third-party site-search product, you want to make sure that your site-search queries do not end up in the search engines’ indexes. A robots.txt exclusion, as sketched below, should be all that is needed to facilitate this.
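A minimal robots.txt sketch, assuming your internal search results live under /search/ or use a q= parameter (adjust the paths to your own platform’s URL pattern):

    User-agent: *
    Disallow: /search/
    Disallow: /*?q=

Note that wildcard rules like the second Disallow are honored by Google and Bing, though they are not part of the original robots.txt standard.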

Move to HTTPS

Many e-tailers are moving their websites to HTTPS. While perhaps not a bad idea for the long-term, the jury is still out on the near-term benefits. If you do this, there are some things that you’ll need to be mindful of: do not forget to update your canonical tags from HTTP to HTTPS. You will also want to create a new Google Webmaster Tools profile under HTTPS and submit your new HTTPS sitemap(s).

Sitemaps

Update your sitemaps if you should elect to move to HTTPS. Aside from this, an e-commerce website will want to keep a fresh feed of pages or URLs in its XML sitemap. Simplify site maintenance by sectioning sitemaps based on the website’s categories; doing so can help you to quickly identify issues within a particular area of the site. You should also have separate sitemaps for video, images, and mobile, if you aren’t already doing so.

Affiliates and Content Scrapers

If you have an affiliate program, keep a close eye on what these folks are doing and make sure your affiliate platform and SEO vendor are doing the same. You give people enough incentive, and it’s amusing to see how aggressive they get in order to earn a commission check. I’ve witnessed affiliate links on porn sites, numerous instances of my clients’ content being scraped, and even an instance of an exact duplication of my client’s website being hosted on a near-match domain. A robust and well-managed affiliate program can drive sales and be a valuable asset in your overall marketing arsenal, but always remember that they don’t work for you – they work for themselves.

This is what drives some of them to devise interesting schemes through which they can increase attributable sales and ultimately their commissions. Just like a team of independent sales folks, they need to be trained in what they can and cannot do and say in regards to your brand. You also must be clear about what your brand considers “out of bounds” when it comes to promoting you.

Some key things to remember when running your affiliate program:

  • Make sure you have detailed terms and conditions that outline what is and is not allowed. This provides you recourse not to fulfill commissions should a publisher be caught in violation.
  • Use trusted affiliate networks and be sure to inquire about their network quality standards – especially if managing in-house. Remember not every affiliate platform is created equal.
  • Affiliates can affect SEO. This depends on how they link to your site and how aggressively they go after intercepting traffic around your brand and trademark. While you cannot specify the positioning of organic listings, you can regulate how and where a publisher promotes links to your website, as well as the text used. You can also place restrictions on active marketing like PPC, particularly for trademark or brand assets.
  • Be sure to regularly provide updated promotional digital assets to your affiliate publishers and maintain a standard of communication. Failing to be proactive in this can result in motivated affiliates doing it on your behalf.
  • Terminate relationships with affiliate publishers if they are caught violating your terms and conditions. Not doing so even after a minor infraction can signify to publishers that they can get away with anything, which can cause more issues in the future.

Lots of Time on Cross-Platform Compliance

Being able to watch a recording of an actual visit can sometimes uncover where elements of a page are breaking and provide insights on how best to enhance the user experience. One of my favorite usability tools is Lucky Orange: it can allow you to determine why conversion rates may have hit rock-bottom for a B2B client’s landing page in a particular browser or platform. Its Form Field Analytics feature also lets you assess the checkout process to see its flaws, which can help to increase conversion rates and decrease cart abandonment.

This article may not be a complete check-list of everything that goes into an SEO effort for e-commerce, but it highlights many of the things that happen to be top-of-mind for me at this very moment. I would love any feedback on the biggest issues that you’ve been faced with in e-commerce SEO – please share via the comments section below!

Mark Jackson is the President/CEO of Vizion Interactive, a digital marketing agency specializing in SEO, PPC, LLM (Local Listing Management), and ROI. Mark entered the digital marketing fray with Lycos in early 2000 and bootstrapped Vizion in 2005.

How to Clean Up Unnatural Links from Freehost Microsite Spam (12 August 2015)

In the past, freehost microsite spam links were commonly used to build effective links to a site. This type of linking is definitely against Google’s quality guidelines; we know that Google can see these, and given a manual review, they treat them as unnatural.

What we don’t know is whether or not the Penguin algorithm can find this kind of link and use it to lower the level of trust that Google has in your link profile. My gut instinct from reviewing the link profiles of hundreds of sites is that the current iteration of Penguin is not terribly good at devaluing this kind of link. However, there is a good chance that when Google eventually updates the algorithm, many sites will not only see a drastic drop in rankings, but will find themselves suppressed by Penguin until they clean up and go through another refresh.

What is Freehost Microsite Spam?

There are hundreds of webhosts, such as WordPress and Blogspot, that will allow you to create a website for free. A common SEO tactic in the past was to create several of these free websites and then link back to your main site using keyword-rich anchor text.

Here’s a fictitious example of a Toronto lawyer who is using this technique in order to rank well:

freehostspam

Each of these websites would contain one or more short articles that link back to the main site using a keyword. In some cases, the spam would be more elaborate: the freehost sites could link to each other as well, in order to create a sort of link wheel. And in some cases, the SEO company who created these links would link to the freehost microsites from high-PageRank sites that were obtained by buying up expired domains.

My goal here is not to tell you how to create freehost spam. Trust me. Google is doing all it can to catch any tricks that are helping people rank unnaturally. If your main site is one that you can’t afford to have penalized or algorithmically demoted for months and months, then you don’t want to risk trying this tactic.

But what do you do if you have this type of link and want to get rid of them?

How to Find These Links

Although Google has given me this type of link as an example of an unnatural link many times, it has also cited freehost microsite spam links that were not listed in Webmaster Tools or any of the known backlink checkers. For example, for one site, we removed links like this:

Torontolaywer.blogspot.com
Torontolawyer.weebly.com
Torontolawyer.wordpress.com

But it turned out that Google was also seeing these sites, which we could not find in our compiled list of backlinks:

Torontolawyer123.blogspot.com
Torontolawyer49.weebly.com
Torontolegaladvice.wordpress.com

In this case, a former SEO company had made thousands of freehost microsites in order to build links for their client. We were able to find about 10 percent of those using Webmaster Tools, Ahrefs, Majestic and Open Site Explorer.

If you know this practice has been used in the past, here are some ways that you can find these:

  1. Ask your SEO company for a list. I’ve had some clients that have been able to contact their previous SEO company and ask if they had a list of sites that they had created. For some of these, the previous SEO company was able to provide us with a spreadsheet containing login info for each freehost. We were then able to log in to the majority of these sites and disable the microsite.
  2. Check your Google Analytics referral traffic. Click on “Acquisition > All traffic > Referrals” and export as much of the data as you can. You can scour the list of referrals to your site to see whether you can find any possible freehost microsite spam. Of course, not all of these microsites will have generated clicks to your site, but some of them may have.
  3. Do some crazy spreadsheet work. This option is complicated, but it works well. The idea is to create a list of known freehost microsites, a list of possible keywords that would have been used to create these sites and then concatenate the lists to generate one big list of possible microsites that would have been created. You can then use a tool like Scrapebox to crawl each of these domains to see if they exist and whether they link to your main site.

I don’t expect many of you to use No. 3, but it’s worth trying if you know that you’ve got a massive mess of microsites and you can’t find them all.

How to Get Rid of These Links

If you’re not dealing with a manual penalty, an easy option is to simply disavow all of these on the domain level. In a previous article, I wrote about whether or not we should be removing or disavowing links for Penguin recovery or prevention. My philosophy is to remove what I can easily remove and then disavow the rest. Many of these freehost microsite spam links are ones that you can easily remove.

Many of the freehosts will have a place where you can report spam. Weebly is awesome for this because they allow you to report spam sites on their platform in bulk. Other freehosts like Blogspot and WordPress will only take one URL per form submission so it can take some time to report all of your spam sites if you have quite a few. Again, this is a good task to outsource. These sites will sometimes be removed quite quickly, but other times, unfortunately, these requests appear to get ignored.

If I am removing these links for a site with a manual penalty, I’ll take screenshots of my form submissions when reporting these sites as spam. I’ll share these with the webspam team, telling them that we couldn’t contact the site owner for these, as they are spam, but that we were indeed taking measures to remove the sites. Of course, in these situations, if I am actually able to log in to the site and remove it, I’ll do that instead of reporting them as spam.

And then, I disavow. Make sure you disavow on the domain level. If you know you have a whack of spam from one freehost, you can disavow that whole domain. For example, you can include this line in your disavow file to disavow all of your spam from any Weebly subdomain:

domain:weebly.com
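Putting it together with the fictitious examples above, a disavow file can mix domain-level entries and comments (lines starting with # are ignored by Google):

    # Freehost microsites created by a former SEO vendor
    domain:torontolawyer.weebly.com
    domain:torontolawyer123.blogspot.com
    domain:torontolegaladvice.wordpress.com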

Why Should You Be Cleaning These Links Up Now?

This type of spam linking is very common and used to work well. I believe that Google is getting better at discounting this type of link, but I still see a lot of sites for which it appears to be working. Those site owners are often reluctant to remove those links because they may be currently helping.

But if I am correct and the next update of Penguin is able to find these links, you could be in serious trouble. One could argue that you could simply remove those links. However, if you are hit by Penguin, your rankings will be suppressed until the next Penguin update, which could take a very long time. Even then, there is no guarantee that you will recover even if you do a good cleanup.

What are your thoughts on freehost spam? Do you think Penguin will target this type of link in the future? Do you have any additional tips to help site owners deal with these? Leave a comment below.

The Perils of Negative SEO: What You Need to Know (11 August 2015)

We’ve been talking about Negative SEO since the advent of Penguin and many still wonder whether it’s a real threat. The lore is that Google will protect websites from losing traffic due to negative SEO, but is this the truth, or just a myth propagated by Google?

A few days ago, I received an email in my inbox:

negative-seo-email1

negative-seo-email2

Yes, you read it right: large volumes of spam and fake links to destroy your competitor’s rankings. Why spend a fortune on things like content marketing and link attraction, when you can surgically remove your competitors with Negative SEO?

The Birth of a New Industry

Think about it: what happened to the thousands of individuals who used to make a living offering cheap link-building services with 10,000 directory submissions and 500 article submissions? Since this black hat approach simply doesn’t work to rank sites anymore – in fact, the effect is the opposite – why not use the same skills for a different product or service?

There are entire companies that have survived by relying on these tactics. Instead of trying to compete in the “content marketing” game, where the ante has been seriously upped, why not stay in their game by obliterating the competition?

You can find dozens of these gigs on Fiverr – as well as many individuals on sites like freelancer.com and upwork.com who offer these services. And for those who used to be in the black hat game, firing up software to drop blog comments, forum and profile links doesn’t take much and costs next to nothing.

What Sites are Protected?

As an experienced Penguin Audit Analyst who’s conducted hundreds of audits and penalty recovery campaigns, I’ve learned to identify sites that are vulnerable to attacks. Generally, sites that can sustain Negative SEO attacks have old, established link profiles with enough volume and authority not to be negatively impacted. Large retail brands, old informational sites, niche authorities: these sites can survive most Negative SEO attacks without experiencing a significant loss of traffic.

Other sites, such as those with few links or few with real authority, can be easily impacted by Negative SEO attacks. And unless the site owners are carefully monitoring their link profiles, including monthly link audits and reviews, they may not know until it’s too late and a Penguin penalty or a manual penalty has already been applied.

Recovery from penalties is difficult and time consuming. Even if you jump through all the hoops, your site still may not recover to its pre-penalty days.

How Can You Protect Your Site from a Negative SEO Attack?

Start by monitoring your link profile carefully. Use a variety of tools – Google Webmaster Tools, Bing Webmaster Tools, Ahrefs, Majestic, Moz – to keep track of all of your inbound links. Compile all of your links from all of these sources into one list, using Excel formulas to dedupe, and compare with a previous list to identify new links gained.

Once you have a list of new links acquired, review these links to determine their quality. Are these links healthy? What are the Domain Authority, SEMrush rankings, and Trust and Citation Flow scores? Review various quality markers for each of the sites to determine if they are healthy or not.

Are the sites using keywords in anchor text? If you didn’t build these, then it’s obvious that you are being targeted for a Negative SEO attack.

Using the Disavow Tool

By monitoring your profile carefully and tracking at least every two to four weeks, you can catch an attack the moment it starts, and update your Disavow file with problematic sites. If you catch them quickly, you can mitigate the possible impact of those sites on your rankings. It’s important to add any potentially suspicious sites, as every low-quality site or anchor text keyword can offset your ratios and trigger a penalty.

Don’t Be a Victim

When I received that email, I knew that thousands of other individuals must have, as well. And as much as we’d like to believe everyone has the best of morals, the reality is that some people will be infinitely attracted to this type of proposal. If one of your competitors takes up a Negative SEO strategy, the results for your business could be devastating.

Do you have any tips on how to keep your site safe from Negative SEO? Do share!

3 Tips for Understanding Your “Courtship” with Search Engines (13 July 2015)

One of the difficulties in SEO I often encounter is explaining the process of site optimization to those who do not understand why it’s necessary and how it relates to your overall marketing strategy. Taking it a bit further, this can even apply to conveying the principles of SEO to others.

Whether you are trying to educate people across the aisle in your marketing department, win the sale with a prospective customer or even tell your mother what you do, it helps to portray the core concepts of SEO in a manner that parallels daily life. I’ve found the easiest way to do this is with a lateral correlation to dating.

Now, granted, I have been out of the dating scene for 12 years – thank you, Jennifer, for accepting my proposal – but SEO and dating are quite similar and provide an easy explanation. For a search marketing veteran, take this as a fun and handy tip; for the SEO beginner, take this as a lesson.

Indexation and Information Architecture

It’s the first date and, if you are lucky enough to get past the awkward silent moments, you begin to gain an understanding of the other person’s life. If you find that they don’t seem to answer questions completely, or if they’re extremely vague in correspondence, it may become difficult to understand the true composition of who they really are. After a good long chat with the other person, you should easily understand what is important to them and what they value. Conversely, if you have any skeletons in your closet or personal traits that are best left unknown, it may benefit you to keep them withheld.

Relationship to SEO

How you are indexed by search engines is the initial first date. Putting it all on the table allows the search engines to take into account everything that makes your site unique, and what keywords you will soon be deemed relevant for. Just as you learn what your date values and deems important, search engines learn what your site values through how you structure navigation and internal linking. On the other hand, there are many times when your site has content that you don’t want search engines to see or crawl.

Make Sure That You…

  • Provide an XML sitemap of all content on your site that you want indexed and visible to crawling.
  • Structure content on your site by importance, with supporting content in a clean folder structure, so that internal linking is easily crawled and descriptively anchored.
  • Employ a robots.txt file for known duplicate content, non-search-critical content and anything you don’t feel is necessary to convey to search engines.


Content/Keywords

You’ve made it past the first date, but you are far from marriage. Your inquisitive mind wants to know more about what makes this other person tick. You inquire about their music tastes, what movies they like and even what media outlets they frequent. Is it easy to understand who they are? You may have been in their home and seen how it’s organized – or how it isn’t. By this point, you’re getting a pretty clear picture of how to categorize this person and whether they are quite similar to you. You should be able to sum this person up in a few distinct keywords.

Relationship to SEO

In the previous section, we began to open up the door and let search engines see what we wanted them to know about us, as well as the hierarchical categorization of what our site represents. We also withheld any information we didn’t want to initially reveal. Now we must convey the specifics of who we are. We do this with the content on our site. We do this in our keyword focus. We provide different content for each of the different types of users we are trying to service.

Make Sure That You…

  • Create text, image and video content specific to keyword topics to build relevance for ranking.
  • Don’t pigeon-hole the content on your site. A quality site contains a breadth of content in several different media. Provide resourceful content – a mix of articles, blog posts, videos and whitepapers – rather than writing 1,000 blog posts to chase keywords.
  • Understand your keyword composition as Google sees it. In Webmaster Tools’ Content Keywords section, you can see, in aggregate, which terms appear most prominently across your site and where they are prevalent.

seocourtship2

Link Neighborhoods and Authority

You are falling in love with this person. You like everything about them. Next, you will meet their family and friends. Unfortunately, this is where things can go south. Finding out they spend most of their personal time with a high school pal who happens to be a drug dealer might not suit you. On the bright side, they may have many friends of reputable or authoritative character who can vouch for your prospective soulmate as an upstanding person.

Relationship to SEO

I call link-building and link-earning the right leg of SEO: you have to have it. Where it was once about quantity, over the past few years it has become much more about quality. If your link profile consists of links from less-than-reputable sources and topically unrelated sites, you are sending a message to search engines that you are not a reputable site either. As in real life, you typically are who you hang out with.

Make Sure That You…

  • Only enter link-building relationships with quality sites. This applies to who links to you as well as to who you link out to, and it is a process that needs continual monitoring.
  • Don’t try too hard to attain links. Trying too hard in link building, as in your personal life, is usually easy to spot because it looks unnatural. The best links are those earned by creating resourceful content.
  • Be relevant: links from topically irrelevant sites make you look shady. Continually monitor the topical makeup of the sites linking to you, to ensure you are portraying the proper image to search engines through the company you keep.

seocourtship3

Some may find this example a little corny, but I find it to be an accurate comparison between SEO and real life. At the end of the day, SEO is the process of creating a relationship between two parties where you want to completely convey your entire makeup, as well as why you are a reputable, authoritative entity. Whether you now want to call yourself a “search marketing Casanova” or a “Google Matchmaker”, that is up to you!

Are You Making These 4 Additional Silly SEO Mistakes? https://searchenginewatch.com/2015/06/15/are-you-making-these-4-additional-silly-seo-mistakes/ https://searchenginewatch.com/2015/06/15/are-you-making-these-4-additional-silly-seo-mistakes/#respond Mon, 15 Jun 2015 10:30:00 +0000 https://www.searchenginewatch.com/2015/06/15/are-you-making-these-4-additional-silly-seo-mistakes/ In May, we reviewed some of the more common “silly” mistakes that digital marketers make when they become consumed in the constantly evolving world of SEO. With so much to do within our daily digital responsibilities, it can become quite easy to overlook the simple things. While we only mentioned three last time, let’s pick up where we left off last month and review more of these mistakes.

4. Title Element Negligence

While a few hundred factors go into the Bing and Google ranking algorithms, the title element was, and still is, a fundamental element of SEO. The most explicit issue I often see is a lack of keyword focus. This area of a page should not be keyword-stuffed, but at the very least it should carry a unique, topical representation of what the page is about. Down the road, with the help of in-depth keyword research, you will get a feel for which terms you should target; for now, you need to create at least some synergy between title elements and the topical theme of each page.

The implicit issue I often see, which isn’t so easily noticed, is title element duplication across entire sections of a site. I call this implicit because title elements are often reviewed and adjusted for top pages – the homepage, service and product pages, and so on – while internal (but still search-critical) pages are overlooked. Commonly, blog, whitepaper, press release and resource section pages simply carry a section-level title element such as “Press Releases” instead of one crafted around the topic or title of the individual page.

What do I do?

Again, use a tool that will scrape all site pages and give you a quick view of the title elements across your site.
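If you don’t have a crawler to hand, a short script can give you the same quick view. This is a minimal sketch in Python, assuming the widely used requests and BeautifulSoup libraries and a hypothetical, hand-made list of URLs (in practice you would pull the list from your sitemap):

    import requests
    from bs4 import BeautifulSoup
    from collections import Counter

    # Hypothetical pages to audit; in practice, read these from your XML sitemap.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/services/",
        "https://www.example.com/blog/some-post/",
    ]

    titles = {}
    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        # Record the title element, or an empty string if the page has none.
        titles[url] = soup.title.string.strip() if soup.title and soup.title.string else ""

    # Flag any title that appears on more than one URL.
    counts = Counter(titles.values())
    for url, title in titles.items():
        flag = "  <-- DUPLICATE" if counts[title] > 1 else ""
        print(url + "\t" + title + flag)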

title-element-negligence

5. Unknowing Duplication of Content

I have talked in detail about issues that are easily seen and those that are not. Duplicate content is one of those areas that typically is not easily recognized. Specifically, the type of duplicate content I am focusing on here is any content on your site that can be found at two URLs. There are other types of duplicate content – such as copy duplicated from another site onto your own – but those can be tackled down the road.

My concern is that you may unknowingly serve the same content at two different URLs, such as a page that resolves both with and without a file extension, i.e. /index.html vs. /. Duplication also arises from parameter usage: do your pages feature filter or sort functions that present essentially the same content and are in need of pagination or canonical tags? You may wonder why eradicating this duplication is important. It signals sloppiness to a search engine, it costs extra crawl budget for a bot to crawl duplicated content, and you may be accruing links to two different URLs for the same content, splitting your link equity.
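Where duplicates can’t be removed outright, a canonical tag tells search engines which URL should receive the credit. A minimal illustration, with hypothetical placeholder URLs:

    <!-- In the <head> of the duplicate, https://www.example.com/index.html -->
    <link rel="canonical" href="https://www.example.com/" />

With this in place, links and ranking signals accrued by /index.html are consolidated onto the extensionless homepage URL.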

What do I do?

To get a feel for what may be in the search results, review your organic landing pages in Google Analytics and page through several hundred rows to see all of the pages that receive search traffic.

unknown-content-duplication

6. Improper Internal Linking

When it comes to links, the major focus is typically directed towards inbound links coming to your site from other sites. While those are a very important part of SEO, they aren’t the only type of linking to consider. How we link internally across site pages helps search engines understand which pages we deem the most important on our site. Many sites confuse search engines, and impede their understanding and crawl of the site, when the following errors occur:

  • Linking the brand logo, and other site-wide homepage links, to a duplicated version of the homepage (or of an internal page), e.g. example.com/index.html instead of the absolute version of the page (see the snippet after this list).
  • Internal links on site pages or in navigation that are broken and return a 404 status.
  • An internal link structure that places unimportant links in the footer and main navigation, so that they accumulate more links across the site than other important, search-critical pages.
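For the first of these, the fix is simply to point every site-wide logo link at the one canonical version of the homepage. A hypothetical sketch:

    <!-- Avoid: <a href="/index.html"> -->
    <a href="https://www.example.com/">
      <img src="/images/logo.png" alt="Example Co. home" />
    </a>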

What do I do?

Review the Internal Links section of your Google Webmaster Tools account to understand which pages Google sees as the most linked-to content across your site. Also review the Crawl Errors section, so you can gain clarity on the error pages Google is finding and where they originate; you may discover broken links in many places across your site. Aside from this research, there are many other tools out there that can identify broken internal links.
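As a rough sketch of what such a tool does – again assuming the requests and BeautifulSoup libraries, with a hypothetical starting URL – you can collect a page’s internal links and check their status codes:

    from urllib.parse import urljoin, urlparse
    import requests
    from bs4 import BeautifulSoup

    start_url = "https://www.example.com/"  # hypothetical page to audit
    domain = urlparse(start_url).netloc

    html = requests.get(start_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for a in soup.find_all("a", href=True):
        link = urljoin(start_url, a["href"])
        if urlparse(link).netloc != domain:
            continue  # skip external links; only internal ones matter here
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status >= 400:
            print(str(status) + "  " + link + "  (linked from " + start_url + ")")

Run it over each key template of your site, or loop it over your sitemap URLs, to cover the sections where broken links tend to hide.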

improper-internal-linking

7. Wait, a Seventh Mistake?

Yes – as I think about the silly mistakes many make on websites, I can’t leave this topic without covering an error I have seen very, very often recently. You may think that social does not have much to do with SEO. However, keep in mind that if your content is shared widely across social properties, it provides social attention signals to search engines as well as opportunities for inbound links.

The social element that many flub – and that drives me crazy – is Open Graph tagging. Open Graph tags are a great way to ensure that a proper representation of your page is displayed when it is shared socially, but this area is often neglected. What I often find is that no Open Graph image is specified, or that the image is simply a brand logo. A blog post with an enticing title and theme loses its appeal very quickly when shared alongside an image of a brand logo. Another common example is titles led off with the name of the site category, which do not read well in the social display.
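For reference, a minimal set of Open Graph tags might look like the following; every value here is a hypothetical placeholder, and the point is that the title, description and image are specific to the page rather than to the brand:

    <!-- In the <head> of the blog post -->
    <meta property="og:type" content="article" />
    <meta property="og:title" content="Ten Quick Wins for Faster Pages" />
    <meta property="og:description" content="Practical steps to cut your page load times." />
    <meta property="og:url" content="https://www.example.com/blog/faster-pages/" />
    <meta property="og:image" content="https://www.example.com/images/faster-pages-hero.jpg" />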

What do I do?

Use a sharing-preview tool to review a sample page that you would want shared socially, so you can see what the display will look like. Don’t have time for that? Go to that page on your site and attempt to share it yourself. How does it look?

open-graph-testing

Conclusion

Yes, I know: SEO is a never-ending process of adjusting to algorithmic trends, taking advantage of keyword and content opportunities, and ensuring that you are easily crawled and indexed. However, before you start an SEO effort – or even if you are currently absorbed in one – it pays to ensure you are not making these silly mistakes that could impede organic search bliss.

Every Second Counts: Why Page Speed Should Be Your Next Focus https://searchenginewatch.com/2015/05/22/every-second-counts-why-page-speed-should-be-your-next-focus/ https://searchenginewatch.com/2015/05/22/every-second-counts-why-page-speed-should-be-your-next-focus/#respond Fri, 22 May 2015 12:00:00 +0000 https://www.searchenginewatch.com/2015/05/22/every-second-counts-why-page-speed-should-be-your-next-focus/ If you have already invested in making your website mobile friendly for Google’s latest algorithm update, then you are ready for the next step: optimizing your pages to load as quickly as possible. Otherwise, all your hard work toward gaining the top rankings in desktop and mobile search will be in vain.

How We Know Page Speed Matters to Google

In 2010, Google announced that site speed is a factor they take into account for search rankings. Since this revelation, they have introduced specific tools for webmasters to analyze and improve their page load times.

First, there is the PageSpeed Insights tool. Enter your website URL and it will show you your speed score for both mobile and desktop devices, plus suggestions on how to fix each of the elements that are slowing your website down.

pagespeedinsights

Then there are the Site Speed reports inside Google Analytics. These reports, in the Behavior Section, will show you average times for page load, redirection, domain lookup, server connection, server response, and page download.

site-speed-google-analytics-reports

You will also find detailed reports about your site speed in relation to specific browsers, countries, and pages.

site-speed-by-page-google-analytics-reports

Google also recommends monitoring your website’s speed performance in the Google Webmaster Technical Guidelines. In addition to their own tool, they suggest using YSlow and WebPageTest.

Why It Should Matter to You

Even if you don’t think that page load affects your Google rankings enough to matter, you should be concerned about it from the user experience side of things. In particular, studies have shown that:

  • A one-second delay in page-load time leads to a drop in page views (11 percent), conversions (7 percent), and customer satisfaction (16 percent), according to the Aberdeen Group.
  • Econsultancy research found that 47 percent of consumers expect to wait no longer than two seconds for a web page to load. Additionally, 88 percent of people who experience a dissatisfying visit due to page load times are less likely to shop from that site and more than a third will tell their friends about the bad experience.
  • According to KISSmetrics, 18 percent of mobile users will abandon a website if it doesn’t load in less than five seconds. If it takes more than 10 seconds to load, 30 percent will abandon the site.

These are the real reasons why you need to ensure that your website loads as quickly as possible. Even if you have the number one ranking in search results for your target keyword, you are wasting all of your marketing efforts if your visitors are leaving due to slow loading times.

Website Hosting

The website hosting provider and technology you choose can have a significant effect on your page load times. Dedicated hosting solutions are preferable to shared hosting, so that you do not have to worry about other websites on the same server slowing yours down.

Another option is to look for services that optimize the delivery of content. For example, load-balancing services can help by distributing an influx of traffic across multiple servers, maximizing website performance and reducing the load that would normally be placed on one server. You can also use a content delivery network, which will deliver web pages and content to website visitors using a server closest to that visitor’s geographic location.

Website Technology

The technology you use to build your website will also play a major role in the performance of your website. A study by Business2Community showed that Shopify, WordPress and Joomla have the best page load times for ecommerce and content management systems, while Website Tool Tester found Webnote, Yola and Weebly to be the top desktop website builder platforms. Mobile website speed performed poorly for all website builders, scoring 68 or less on Google PageSpeed.

The fastest WordPress themes include Schema, The Foundly and Braxton, according to tests by Colorlib. Compared with Facebook, Disqus, Livefyre and IntenseDebate, WordPress’ base comment system also loaded fastest, Pingdom found.

Website Content

One area that every webmaster can work on to increase website speed is the content. Simple changes, such as using smaller image files, can help reduce the load time of your pages. Look for widgets on your website that might be increasing page load times, such as those that bring in your latest activity from social networks.

You should also regularly review analytics and other tools you use on your website. For example, you might have tested several platforms for analytics, heatmaps, and website optimizers. If you’re not actively using them all, you should remove the code from your website as each one can slow your website.

The Google PageSpeed Insights tool can suggest specific ways to fix other issues that might slow your page load, such as minifying website code, leveraging browser caching, and enabling compression.
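How those last two fixes look depends entirely on your server stack. As one hedged example, on an Apache server with the mod_deflate and mod_expires modules enabled, compression and browser caching can be switched on with a few lines of configuration along these lines (the file types and cache lifetimes are illustrative, not prescriptive):

    # Compress text-based assets before sending them (requires mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

    # Tell browsers how long they may cache static assets (requires mod_expires)
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"

Nginx and other servers have equivalent directives, and PageSpeed Insights will confirm whether the changes took effect.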

For further research, you can use BuiltWith browser extensions to see what technology your favorite, fast-loading sites use.

website-technology-used


Homepage image via Shutterstock

Five Ways to Glean Important Insights From GWT’s New Search Analytics Report https://searchenginewatch.com/2015/05/21/five-ways-to-glean-important-insights-from-gwts-new-search-analytics-report/ https://searchenginewatch.com/2015/05/21/five-ways-to-glean-important-insights-from-gwts-new-search-analytics-report/#respond Thu, 21 May 2015 12:00:00 +0000 https://www.searchenginewatch.com/2015/05/21/five-ways-to-glean-important-insights-from-gwts-new-search-analytics-report/ According to Google’s Inside AdWords blog, mobile search has surpassed desktop search in 10 countries, including the United States and Japan. Previously, site owners faced challenges when trying to gather accurate and insightful data to create actionable marketing strategies.

Now, your needs are being met. Google recently launched a new Search Analytics report within Search Console, as Webmaster Tools was rebranded yesterday, to provide more meaningful and authentic reports.

For example, image clicks will now be counted on expanded images, whereas the older search queries report counted clicks on both expanded and unexpanded images. Also, all search links to a single site will be counted as a single impression rather than multiple impressions. Because data will now be counted in a different way than the old Search Queries report, you are likely to see a decrease in your reported search traffic but more accurate data overall.

The Search Analytics report’s immense capabilities allow webmasters to drill down further into a site’s detailed traffic breakdown; users will be able to compare search results by query, landing page, country, device type, search type, and date. You can then use the new data gathered to improve your site’s performance.

Here are five ways to glean important insights from this tool:

1. The Google Update Tracker

The most talked-about trend in SEO and search in the last month has been Google’s mobile-friendly, or “Mobilegeddon”, algorithm update, which was designed to give preference to mobile-ready sites over non-mobile-ready ones. Through the Search Analytics report, you’ll be able to compare mobile traffic to desktop traffic before and after the update, to see whether your traffic has deviated from its typical performance in the weeks following the algorithm update.

Simply choose a preferred date range and cross-compare based on device type (Desktop vs. Mobile). Google will include an update line to show when there are “Data Anomalies” in Search Console to explain why changes have happened.

google-search-analytics1

2. The High-Low Split

Consider this: if certain landing pages on your website have a high number of impressions and sit near the top of the search results page but have a low CTR, you should start asking why those links aren’t being clicked and how you can improve your content to satisfy users’ interests. Analyzing which pages have high impressions but a low CTR is imperative for ensuring success. Begin by filtering by landing page (for example, your /products page) and choosing a preferred date range.

Then, compare impressions against CTR to see which results are being viewed but aren’t driving traffic. Once you have made a list of landing pages, you can further filter for those specific landing pages and see if the low CTR is influenced by the meta description or a low overall position.

google-search-analytics2

3. ‘Tis the Season

Build upon the aforementioned meta description analysis by testing different versions of meta descriptions to fit events. For example, if an ecommerce website wants to run a promotion for its products before Black Friday, it is essential to compare the metrics for the relevant landing pages before and after the meta description changes, to see which of the implemented changes performs better.

When using the Search Analytics report and comparing by date range, a new column titled “Differences” will highlight the exact change in the specified metrics.

google-search-analytics3

4. The Expectation Game

Part of building an SEO-friendly site requires an intimate knowledge of what users search for with regards to your website. Grouping based on queries may seem like a fairly obvious way to gain insights on how users see your site, but it is important nonetheless. Firstly, it is crucial to ensure targeted keywords that you expect to drive users to your site actually do lead to traffic. Secondly, grouping by query allows you to screen for unexpected queries that are sending users to your site. If queries that are entirely unrelated to your site or commonly associated with Black Hat SEO are driving users to your site, it may be time to drill deeper. You can do so by filtering for the unexpected queries and clicking on the pages button to see which pages are being affected.

After grouping by queries you can also track which long tail keywords are getting a decent amount of impressions; these can then be used in content creation for landing pages and blog posts.

5. Branded Insights

Coming up with a brand name and developing it over time is a challenge that every website faces. A related challenge is creating an association between your brand name and the relevant search terms for your site. Being able to filter branded vs. non-branded search queries will help you better understand how people are searching for your site. With the new Search Analytics report, the differences are extremely clear and straightforward: simply compare two queries (one branded and one non-branded) that are pertinent to your site and filter by impressions and CTR (a scripted version of this split follows below).
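If you would rather script the branded/non-branded split than run it through the UI, the same data is exposed programmatically by the searchanalytics.query method of Google’s Webmasters API (version 3), which Google opened up after launching the report. A minimal sketch in Python, assuming the google-api-python-client library, an already-authorized OAuth2 credentials object named creds, and a hypothetical brand name:

    from googleapiclient.discovery import build

    # `creds` is assumed to be an authorized OAuth2 credentials object.
    service = build("webmasters", "v3", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2015-04-01",
            "endDate": "2015-05-01",
            "dimensions": ["query"],
            "rowLimit": 1000,
        },
    ).execute()

    branded, non_branded = [], []
    for row in response.get("rows", []):
        query = row["keys"][0]
        # "example" stands in for your brand name (and common misspellings).
        (branded if "example" in query else non_branded).append(row)

    for label, rows in (("Branded", branded), ("Non-branded", non_branded)):
        clicks = sum(r["clicks"] for r in rows)
        impressions = sum(r["impressions"] for r in rows)
        print(label + ": " + str(clicks) + " clicks / " + str(impressions) + " impressions")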

The Search Analytics reporting tool is more accurate and more powerful than the old Search Queries tool, and it is now the default option in the Search Traffic module of Search Console. Don’t worry, though: you will still have access to Search Queries for three more months, giving you time to familiarize yourself with the new reporting tool before Google closes off access to the older reports for good.

google-search-analytics4

One limitation: the Search Analytics tool only surfaces information from the past 90 days, which prevents webmasters from measuring historic site performance or year-over-year and month-over-month progress. You do, however, have the option of downloading all the data periodically and building your own archive, which allows you to track performance over time.

Michael McManus, earned media team lead at iProspect, also contributed to this article.

Google Unveils Rebranded Webmaster Tools https://searchenginewatch.com/2015/05/20/google-unveils-rebranded-webmaster-tools/ https://searchenginewatch.com/2015/05/20/google-unveils-rebranded-webmaster-tools/#respond Wed, 20 May 2015 16:28:00 +0000 https://www.searchenginewatch.com/2015/05/20/google-unveils-rebranded-webmaster-tools/ In a move to reach out to more users who care about search marketing, Google has rebranded Webmaster Tools as Search Console. The traditional idea of the “webmaster” reflects only some of its users, according to Google. By rebranding Webmaster Tools, the company hopes the product will serve a broad spectrum of users, including hobbyists, small business owners, SEO experts, marketers, programmers, designers, app developers, and of course, webmasters. Google is going to roll out the update over the coming weeks.

Today also marks the first official day of Google’s partnership with Twitter. In February, the companies announced that mobile users in the U.S. will be able to see relevant tweets when they search on Google.

Google Reveals Search Analytics Report for More Refined Data https://searchenginewatch.com/2015/05/06/google-reveals-search-analytics-report-for-more-refined-data/ https://searchenginewatch.com/2015/05/06/google-reveals-search-analytics-report-for-more-refined-data/#respond Wed, 06 May 2015 18:18:00 +0000 https://www.searchenginewatch.com/2015/05/06/google-reveals-search-analytics-report-for-more-refined-data/ Google has introduced the Search Analytics report to help webmasters better analyze their website traffic.

The new Search Analytics report enables users to break down their site’s search data and filter it in many different ways. For example, webmasters can compare mobile traffic before and after the “Mobilegeddon” algorithmic update released on April 21. If they have an international website, they can find the countries where people search most for their brand by choosing “impressions” as the metric, filtering by brand name, and categorizing results by country.

search-analytics-by-country-small

Image Credit: Google

Webmasters can also use the Search Analytics report in many other ways to make the best decisions for their website’s performance.

Google points out that, compared with the existing Search Queries report, the Search Analytics report offers more accurate data. For example, the new tool counts only clicks on an expanded image, while the old Search Queries report counted any click on an image, expanded or not. Click counts may therefore be lower in the new report, but they are more meaningful.

Google is rolling out the new Search Analytics report today. It will keep the existing Search Queries report for an additional three months.
