Questioning the Effectiveness of Keywords in URLs

By Deane Barker on December 29, 2006

There’s an accepted theory in SEO: put keywords in your URLs. It’s so widely accepted that no one questions it, and content management systems routinely ship modules, extensions, and settings that let users create keyword-rich URLs.
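For context, the keyword-rich URLs these CMS modules produce usually come from a “slugify” step that turns a page title into a URL fragment. A minimal sketch of the idea (the function and its rules are my own illustration, not any particular CMS’s implementation):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated URL slug."""
    # Normalize accented characters to their closest ASCII equivalents.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Lowercase, then collapse any run of non-alphanumeric characters into one hyphen.
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())
    # Trim hyphens left over from leading/trailing punctuation or spaces.
    return text.strip("-")

print(slugify("Questioning the Effectiveness of Keywords in URLs"))
# questioning-the-effectiveness-of-keywords-in-urls
```

Whether that slug in the URL actually moves the ranking needle is exactly the question at issue.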

But, does this work? Does anyone know for sure? I’ve been casually looking for a while for resources which prove or disprove the effectiveness of this theory, but I haven’t found much. Does this emperor have any clothes?

Years ago, I read Shari Thurow’s book “Search Engine Visibility” and she said this:

Simply placing keywords in a domain name and/or filename is not going to make or break top search engine visibility. […] Keywords in domain names and filenames are not as important as people are led to believe. […] Keywords in a domain name give a minuscule boost when all other factors (text, link, and popularity components) are equal.

I agree with this — I feel the emphasis on this SEO technique is way, way over-rated. There are so many other factors people should worry about before they start aliasing pages. Put another way, keywords in URLs might be a small part of an over-arching SEO strategy, but they’re not worth much by themselves.

(Now, don’t confuse this with an indictment of clean URLs. I like URL cleanliness, which we’ve talked about quite a bit around here. I think clean, short URLs have a distinct usability benefit. Hence, the URLs on this site.)

I had someone come to me earlier this year in a big hurry to alias all their URLs to help their search engine positioning. I took a look through their site and noticed this:

  1. No unique titles
  2. No meta
  3. Horrific HTML
  4. No keywords in header tags
  5. etc.

I explained to them that they had much bigger SEO problems than keywords in their URLs, but they paid me all the same to allow them to alias all their pages. I have no idea if it helped them or not, but I doubt it.
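Basics like the first two items on that list are easy to check mechanically. Here’s a minimal sketch of such an audit — missing titles and missing meta descriptions — using only Python’s standard library (the checks and names are my own illustration, not a real SEO tool):

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Record whether a page has a <title> and a <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and dict(attrs).get("name", "").lower() == "description":
            self.has_meta_description = True

def audit(html: str) -> list:
    """Return a list of problems found in the page's head markup."""
    parser = HeadAudit()
    parser.feed(html)
    problems = []
    if not parser.has_title:
        problems.append("no title")
    if not parser.has_meta_description:
        problems.append("no meta description")
    return problems

print(audit("<html><head></head><body><h1>Hi</h1></body></html>"))
# ['no title', 'no meta description']
```

Run something like this across a whole site before spending a dime on URL aliasing.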

Getting all uptight about URLs in most cases is like me saying, “I want to win the Mr. Olympia bodybuilding championship…so I need to go get a tan right now.” Sure, having a good tan is a small part of being a competitive bodybuilder, but there are probably quite a few other things I should focus on first. Like my abs.

As such, I feel that keywords in your URLs should be #7 or #8 on a 10-point list of the top SEO techniques, not the #1 or #2 spot that people keep putting it in. Again, I’m looking for any solid research on the effectiveness of this technique. If you know of anything, comments are open.

(Irony: mouseover that Amazon link above and check out the URL…)


Comments

  1. Hi Deane! Hope your holidays are going well.

I’ve often wondered why you don’t go with more descriptive URLs. Your short URLs are nice, but I bet you could bump up your visibility another notch with the title in your URLs. Not that you need it, since you are obviously doing well already.

I definitely think that having keywords in URLs helps, but a Google ego search for Matt Smith has me in the top 30 or so out of 15 million other results — and my name has only been in the title of one post over the past year. I attribute the high showing to other page markup factors (meta, titles, etc.), and it definitely helps having a couple of direct links from Gadgetopia to me.

    I think the best write-up that I have seen is Mike Davidson’s Lessons From The Roundabout SEO Test. Of course, prepare to be let down about the importance of valid HTML and proper semantics.

    Take care!

The domain itself has a lot to do with ranking. Adding brand or generic keywords makes more sense than using acronyms, for instance (if you’re optimizing for SEO rather than human recall). Folder and page names matter less, especially in the case of multiple hyphenated keywords, which are quickly interpreted as spam by Google and friends – using keywords in folder and page names is an easy way to overstuff keywords. If the content (especially what’s inside headline tags) doesn’t support those keywords, you’re actually hurting your rankings.

    There are definitely people more qualified than me to answer this, but I think the answer/educated guess to the question is “yes, they do help.” But you should focus on content and semantic markup first. And try not to break the camel’s back with too many keywords.

  3. Actually I wanted to do a test on exactly that matter. So about 2 years ago I built a microsite for a product, and instead of using the CMS’s (Typo3) URL rewriting (which I used on all other sites) I left it at the default setting generating pages along the lines of “www.productname.com/index.php?id=42”.

Despite those “ugly” URLs, the site developed quite well, generated a lot of sales, and still today ranks among the top 5 for its keywords.

What one needs to consider, though, is that it is a site for a niche product in a rather small industry (maybe 10,000 prospective customers worldwide). And the product actually had no competition.

In order to accurately test the effect, one would need TWO sites targeting the same keywords, both pretty similar to each other. But then you might run into duplicate content issues. So a clean test is pretty much impossible.

On SEO forums it is believed that the ranking algorithm features several hundred variables. A lot of people believe that the sheer existence of a robots.txt is one of these variables. And — as with keyword-rich URLs — it is a rather easy one to implement. So why leave it out, if all it costs you is a configuration switch?

And finally, we design sites for humans, not for search engines. And a human can remember a URL like http://www.example.com/products/widget.html much more easily than http://www.example.com/shop.php?id=4711.

Comments are closed. If you have something you really want to say, email editors@gadgetopia.com and we’ll get it added for you.