It's recently come to my attention that a large percentage of the Slaptijack site has ended up in the dreaded "Google Supplemental Index". This means that key pages are not being served during normal searches. You can see which pages are in the Supplemental Index by running the following search on Google's site.

site:slaptijack.com *** -view

Running the above query shows that 80 of the 100 pages Google has indexed are in the Supplemental Index. Since this index is only rarely used, that means that 80% of the SlaptiGoodness™ is not being served immediately to the hungering public. So, in an effort to cut down on what is getting pushed into the Google Supplemental Index, I've made two big changes.

  1. Implement a highly-modified robots.txt.
    I used many of the suggestions posted by Nathan at Not So Boring Life to cut down on the number of duplicate entries the Googlebot is able to find. Since duplicate content is an easy way to get dropped into the Google Supplemental Index, it's possible that the easy access to this site's content has actually had a negative impact!
  2. Only use one category per post.
    I think this is where we are going to run into the most trouble. Previously, I had been putting posts into every category that seemed relevant. At times, this meant that a post might be listed under Networking AND System Administration. Unfortunately, since I use categories in my permalinks, this created two different URLs for the same post. Duplicate content!
    So, I've put all posts into one category. This means that some links might be dead.
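For the curious, a robots.txt built along these lines typically blocks the Googlebot from the duplicate views WordPress generates. The rules below are a hypothetical sketch of that approach — the actual rules Nathan suggests at Not So Boring Life may differ:

    # Hypothetical example of WordPress duplicate-content rules;
    # not the exact robots.txt used on this site.
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /trackback/
    Disallow: /feed/
    Disallow: /comments/

The idea is that archive pages, feeds, and trackback URLs all serve the same post content under different URLs, which is exactly the kind of duplication that lands pages in the Supplemental Index.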
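Those dead links from the removed categories don't have to stay dead, though. On an Apache server, a 301 redirect in .htaccess can point the old category permalinks at the new single category. The category names below are placeholders, not the site's actual structure:

    # Hypothetical .htaccess rule: permanently redirect posts under the
    # old category slug to the one category they now live in.
    RedirectMatch 301 ^/system-administration/(.*)$ /networking/$1

A permanent (301) redirect also tells Google to transfer any ranking from the old URL to the new one, rather than treating the new URL as a fresh page.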

Hopefully, all this work will get most of the site out of the Google Supplemental Index and into the main index. If you see any problems with the site, I encourage you to contact me via the email address listed in About Slaptijack.
