Thursday, 5th February, 2015 | By Jeremy Girard | Category: Traffic Building

SEO Update: How to Rule the Rankings in 2015 (Part 2)

From Panda to Penguin to Hummingbird to Pigeon, the ever-changing landscape of SEO looks far different in 2015 than it did five, three or even just one year ago. In this multi-part article series, we’re separating SEO myth from SEO fact, both to arm you with the information you need to ward off the snake-oil practitioners looking to make a quick buck and to help you climb the ranks of the search engine results pages the right way.

In our first installment, we took an in-depth look at the best practices for using keywords on your website. In this article, we’re broadening our focus to your site’s content, with a definitive answer to the age-old SEO debate about whether a big site with lots and lots of pages (and therefore lots and lots of keywords) will help you secure a better standing than a smaller site with fewer pages.

So when it comes to SEO, does size matter?

Why SEO experts like big sites

For years, many SEO experts believed that bigger was better when it came to the number of pages encompassed by a website. The reasoning behind this theory is as follows: any given webpage can be effectively optimized for only a small number of specific keywords or terms – maybe two or three at most. Therefore, to target more keywords with your website, you would need to incorporate more pages into the site, with each one optimized for a couple of specific keyword phrases. In doing so, you could obey the rule of thumb of keeping each page optimized for a maximum of three keywords while still conquering a larger number of keywords and phrases spread across your site as a whole.

This logic is sound, and in some ways, this approach can be effective, but the formula for SEO success is not as simple as more pages = better ranking for more keywords. So before you start blowing up your site map, here are three important caveats that you should consider.

More pages mean more work

A larger website means more work for you. First, there is the problem of creating content. The bigger the site, the more pages it will have. The more pages the site has, the more time it will take to develop the content for those pages and ultimately build that site. While this is hardly a shocking revelation, what many people fail to consider when they deploy a larger site is that the added work does not end when the site launches.

Unlike a brochure, a website is a living, breathing entity. It should continue to evolve and change at a rate that keeps pace with your brand and your business. Therefore, you should always keep tabs on your content to ensure that it reflects your latest product and service offerings as well as your most current client experience or body of work.

On a regular basis, you should perform an audit of each page of your site to answer the following questions (one way to keep track of this cycle is sketched after the list):

  • Is the content of this page current? Is there anything that needs to be revised or updated?
  • Is there anything that is no longer relevant and should be removed completely?
  • Is there anything missing? Is there any new information that should be added?
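
If your site has more than a handful of pages, even this simple checklist benefits from some bookkeeping. Below is a minimal sketch in Python of one way to track the review cycle, assuming you keep a CSV log of when each page was last audited; the file name, column names and 180-day interval are illustrative assumptions, not a standard.

    # Minimal audit-cycle tracker (a sketch, not a prescription).
    # Assumes "audit_log.csv" has one row per page: url,last_reviewed (ISO date).
    import csv
    from datetime import date, timedelta

    REVIEW_INTERVAL = timedelta(days=180)  # hypothetical: revisit pages twice a year

    with open("audit_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            last_reviewed = date.fromisoformat(row["last_reviewed"])
            if date.today() - last_reviewed > REVIEW_INTERVAL:
                print(f"Due for review: {row['url']} (last reviewed {last_reviewed})")

The longer the list this prints, the more post-launch maintenance your page count is costing you.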

This process – and the resulting changes that must be implemented once you complete this evaluation – takes time. If you have a very large site, it will take more time than if you have a smaller site. This is absolutely something you should take into account upfront, before you decide to deploy a massive site with lots of pages aimed at boosting its visibility for as many keywords as possible. To keep that site up to date and performing well, you will need to invest considerable time in maintaining it post-launch.

Don’t be duped into duplicate content

Another challenge posed by expansive sites with many pages that are each focused on a handful of specific keywords or phrases is that those sites often stumble unwittingly into the territory of duplicate content. This happens when a so-called SEO expert recommends creating multiple pages for the same product, service or idea, each using slightly different keywords to try to get those pages ranked for all of those phrases.

The problem with this approach is that the resulting pages are nearly identical to each other, which can trigger Google to view the pages as having duplicate content. SEO experts have long debated whether or not Google actively penalizes sites with duplicate content. What cannot be debated, however, is that search engines have become much better over the years at understanding synonyms. This means that even if a specific keyword does not appear on your site, your site can still rank for it as long as a popular synonym is present. Google’s Matt Cutts sums up this evolution in algorithmic intelligence as follows: “Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed.”

Cutts’ statement essentially nullifies the idea that your site needs lots and lots of pages filled with different, but related, keywords in order to rank for each of those terms. On today’s Web and with today’s search engines, that is simply not the case.
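
If you suspect your existing site has already drifted into near-duplicate territory, a quick spot check is to compare the text of pages targeting related keywords. Here is a minimal sketch in Python; the example URLs, page texts and 90 percent threshold are hypothetical, and a real audit would pull the text from your live pages.

    # Rough near-duplicate check (a sketch under the assumptions above).
    from difflib import SequenceMatcher
    from itertools import combinations

    pages = {  # hypothetical URLs and page texts
        "/dyes-for-plastics": "We supply industrial dye products for plastics ...",
        "/plastic-dye-products": "We supply industrial dye products for plastic ...",
    }

    # Compare every pair of pages and flag those that are almost identical.
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        similarity = SequenceMatcher(None, text_a, text_b).ratio()  # 0.0 to 1.0
        if similarity > 0.9:
            print(f"Possible duplicates: {url_a} and {url_b} ({similarity:.0%} alike)")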

Finding the right customers

The goal of search engine optimization is to ensure that your website shows up when people who need your products or services are actively looking for those products or services. You do not need just anyone to find your site; you need the right people to find you.

But what’s the harm in driving as much traffic to the site as possible? The potential downside to bringing a large volume of irrelevant visitors to your site is that you risk wasting your company’s resources trying to serve customers who ultimately will not buy from you.

For example, I have a client who sells industrial dye products. Their customers are other businesses that need dyes for plastics, textiles and other industrial applications. They do not sell to the general public; yet they would routinely receive inquiries through the website from people who wanted to dye their sofa or their car’s interior or their cat (as odd as it sounds, they apparently received this request at least a few times each month from different people). The time it takes to reply to these inquiries adds up, and the sales department began to lose faith in the site because they felt it was generating too many bogus leads and not enough legitimate ones (this was not really true, but that was the perception).

The solution to this problem was to refine the site’s content so that it did not cast such a wide net. At the time, the site’s content was not tailored to the company’s specific target audience; rather, their goal was to appear in the search results for as many keywords as possible and to bring as many leads to the site as possible, which resulted in a disproportionate amount of unqualified leads.

While the sheer number of inquiries coming in from the site dipped after we adjusted their SEO strategy, the overall quality of those leads improved greatly once they scaled back their page count and turned their focus to driving the right customers to their site.

What’s the magic number?

So how many pages should your site contain? There is no magic number. Instead, your site should contain exactly as many pages as are needed to effectively accomplish your business objectives.

Once you start adding pages solely because you think they will boost your standing with the search engines, you are heading down a slippery slope. Your best bet is to create a highly user-friendly site with top-quality content and a streamlined navigational structure built around serving the needs of your customers, not manipulating the search engines. The result will be a site that becomes popular over time with both your customers and the search engines and that you can effectively maintain, rather than an artificially bloated site that is a hassle to navigate, a burden to manage and unlikely to give you any advantage with the search engines anyway.

Jeremy Girard
Jeremy Girard has been designing for the web since 1999. He is currently employed at the Providence, Rhode Island-based firm Envision Technology Advisors and also teaches website design and front-end development at the University of Rhode Island. In addition, Jeremy contributes regularly to a number of websites and magazines focused on business and the Web, including his personal site at Pumpkin-King.com.