Practicing Common Sense SEO

There is no shortage of SEO advice, published in different forms and for different reasons. Some quarters see SEO as an industry bound to fail because it keeps probing the underlying secrets of search engine algorithms without ever guaranteeing results.

With such claims floating around, people new to the industry get confused. Once they read about link farming, buying keyword-rich domain names and other measures I call desperate, they become potential contributors to the filth the web has brought us (in addition to spam and viruses).

Despite all the hype surrounding the hot topic of search engine optimization, I still stick to common sense: I say no to rule books (length must be…, density must be…) and yes to guidelines (here, here and here).

I am even unsure whether I should use ethical rather than common sense. In the end I compromised: the title of the article uses the latter and the URL structure uses the former.

I have compiled the items I think are important in optimizing our pages:

  • Content. Content gives search engines an idea of what your page is all about. Even if they are not as sophisticated as some AI robots (yet), major search engines do have much improved semantic and lexical analysis. They know which terms are necessary (though it makes me wonder: if stop words belong in the text, why remove them?). They handle plural forms, tenses and other stemming procedures. So proper grammar, correct spelling and sufficient proofreading ensure that search engines “understand” the message we want to convey. That way they can place our page appropriately in search results once a relevant query is entered.
  • Navigation. A sound navigation scheme gives both human and search engine visitors accessible passageways to examine the entire site without asking for keys (passwords) to get into certain rooms (pages). If we fancy drop-down menus instead of the more generic left-hand ones (as seen on this page), CSS-based menus such as Suckerfish dropdowns are a good choice. Avoid JavaScript menus where possible. Why? Many of them do not use the anchor tag <a> that search engine robots follow to discover links, and I believe search engines have other things to do before accommodating this type of linking method.
  • Meta and Title Tags. Each page MUST have a unique title simply because it is one of the first things search engines notice, much like the larger article headline is for readers. I think shorter titles work better for specific queries because of keyword prominence. Spending time on the meta description still pays dividends, even if search result snippets are sometimes culled from both the meta description and the body of the page. Forget about meta keywords; I use them mainly to note which keywords a particular page has been optimized for.
  • Internal Linking and URL Canonicalization. Ideally, every page in a site links to every other, so users can get to any page in just one click. Ditto for search engine robots, which then do not have to pass through a single path (say, the Site Map). Search engines have to crawl a page (scan its contents: text, alt text, image attributes, etc.) and index it (store it in their database for future display) before it can rank, so ask yourself first whether your page is indexed (using the site: operator) before asking why it is not ranked. Implementing drop-down menus could address this issue (see Navigation). Canonicalization ensures that search engines see only one URL for a specific page. Since the bare domain and the same page addressed by its file name (say, /index.html) are two URLs displaying the very same page, the two could end up competing against each other. I always use the domain URL when referencing the homepage.
  • External Linking. Pages linking to our site score a plus point in the eyes of search engines, especially Google, because a page that attracts links is generally thought to be of good quality (this is where link spamming was born). The more links that point to our pages, the more popular we become; and more channels for generating visitors mean the site is exposed to a wider audience, provided of course that the pages linking to us receive a considerable number of visitors too.
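To make the points about titles, descriptions and crawlable navigation concrete, here is a minimal sketch of a page head and menu (the domain, page names and menu items are placeholders, not a prescription):

```html
<head>
  <!-- Unique, specific title: one of the first things search engines notice -->
  <title>Practicing Common Sense SEO</title>
  <!-- A hand-written description still pays dividends in snippets -->
  <meta name="description" content="Common-sense guidelines for optimizing pages without resorting to desperate measures.">
</head>
<body>
  <!-- Plain anchor tags: robots can follow these, unlike many JavaScript menus -->
  <ul id="nav">
    <li><a href="/">Home</a></li>
    <li><a href="/articles/">Articles</a></li>
    <li><a href="/sitemap.html">Site Map</a></li>
  </ul>
</body>
```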
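The canonicalization idea can be sketched in a few lines of Python. This is only an illustration of how several URL variants collapse into one canonical form (example.com and the index filenames are assumptions); in practice the job belongs to server-side 301 redirects, not application code:

```python
from urllib.parse import urlsplit

def canonical(url):
    """Reduce common homepage variants to one canonical URL (rough sketch)."""
    parts = urlsplit(url.lower())
    host = parts.netloc
    if host.startswith("www."):
        host = host[4:]                          # treat www and non-www as one site
    path = parts.path
    for index in ("/index.html", "/index.php", "/default.htm"):
        if path.endswith(index):
            path = path[: -len(index)] + "/"     # strip index filenames
    if path in ("", "/"):
        path = "/"                               # bare domain always maps to "/"
    return f"http://{host}{path}"

# All four variants collapse to the same canonical URL,
# so they no longer compete against each other:
variants = [
    "http://example.com",
    "http://www.example.com/",
    "http://example.com/index.html",
    "http://WWW.EXAMPLE.COM/index.html",
]
assert len({canonical(u) for u in variants}) == 1
```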

Things to Avoid

  • Hidden Text and Hidden Links. Are we out of our minds? Why put up hidden text (white text on a white background) and hidden links if our human visitors cannot see or click them?
  • 1×1 GIF with Tons of Alt Text. Meaningless, senseless, a waste of time and an effective ban invite.
  • Cloaking. Stop masquerading. Be yourself. Show your stuff.
  • Duplicate Content. It will not get you banned, but duplicate pages are ignored anyway.
  • Link Farming. May help generate traffic but gets no love from relevance-conscious search results.
  • Wrong Redirection. Meta refresh and JavaScript redirects (is JavaScript a bane to SEO?) are interpreted as manipulating which pages a visitor must see. Better to use 301 redirects.
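For reference, a server-side 301 is a one-liner. Here is a minimal sketch for an Apache .htaccess file, assuming mod_alias and mod_rewrite are enabled (the domain and page names are placeholders):

```apache
# Permanently move an old page to its new home with an HTTP 301:
Redirect 301 /old-page.html http://example.com/new-page.html

# Redirect the www variant to the bare domain, which also helps
# with the canonicalization issue discussed above:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

Unlike a meta refresh, the 301 tells search engines plainly that the page has moved for good, so they can transfer the old URL's standing to the new one.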

Stay honest. Do not resort to any method that you think will not benefit human visitors or that appears to deceive search engines. Good SEO is honest SEO. Otherwise, we might end up in the search engine dumpster for a while as we repent and ask for reinclusion.

Do not rely too much on what you read. Apply common sense and test the advice yourself. The more you test, the more you realize that a lot of people write SEO advice they do not apply at all.