Cloaking: Getting “Impossible” Sites Ranked

More often than not, cloaking or IP delivery is a must for companies whose web sites offer very little of what search engine spiders have preferred ever since their invention: text content.

Some typical examples:

  • Your web site is made up primarily of dynamically generated content.
  • Your setup consists primarily of catalogue pages with short descriptions or price tags.
  • Beyond this, your content is limited to category headers, product names, etc.
  • Your web site features many documents in PDF format.
  • Your web site is rich in graphics.
  • Your web site features Flash code.
  • Your web site features Real Video.
  • Your web site features audio streams.
  • Your web site features Quickview code.
  • Your web site features Shockwave code.
  • Your web site features Java applets.
  • Your web site navigation is JavaScript based.
  • Your web site requires cookies, e.g. to determine which content to display to whom.
  • Your web site is organized via a Content Management System, generating non-standard URLs, visitor tracking strings, session codes, etc.
  • Your web site is run – in part or in full – via binary executables and/or Perl and CGI programs, employing URLs riddled with special characters such as question marks, blanks, proprietary path symbols, etc.

In all these cases, search engine spiders have precious little to go on in their quest to determine what your web site is actually about – and as a webmaster, you for your part don’t have much leeway to seed your pages with the keywords and search phrases you are targeting. Considering that you’re in a highly competitive environment with thousands of other sites offering very similar programs, it doesn’t take a genius to figure out that your chances of achieving good-to-excellent search engine rankings are practically zero!

In short: almost everything that qualifies as state-of-the-art these days cannot be spidered or indexed efficiently!

Within this scenario, what you will want to do is set up an industrial-strength cloaking outfit to feed the spiders what they need. You want to offer the crawlbots pages they will be really happy with: rich in relevant content, with your targeted keywords or search phrases included at a good keyword density and featured in the page titles, meta tags, site links, etc. This requires the use of phantom pages and Shadow Domains™. So how does it work?
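At its core, it works by telling spiders and human visitors apart on every single request. Here is a minimal detection sketch in Python (the user-agent substrings below are illustrative assumptions only – serious IP delivery setups work from extensive, continuously updated lists of spider user-agents and, above all, IP addresses, since user-agent strings are trivially spoofed):

    # Minimal sketch of spider detection by user-agent matching.
    # The substrings below are illustrative only; real cloaking
    # setups maintain large, regularly updated signature lists
    # and cross-check the requesting IP address as well.
    KNOWN_SPIDER_SUBSTRINGS = (
        "googlebot",   # Google
        "slurp",       # Inktomi
        "scooter",     # AltaVista
    )

    def is_spider(user_agent: str) -> bool:
        """Return True if the request appears to come from a spider."""
        ua = user_agent.lower()
        return any(token in ua for token in KNOWN_SPIDER_SUBSTRINGS)

With that check in place, the remaining question is what to serve to whom – which is where phantom pages and Shadow Domains™ come in.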

Phantom pages are web pages offering highly optimized content intended for search engine spiders only. Shadow Domains™ are dedicated web properties focused on offering optimized content to search engines while redirecting human visitors to another site, typically a company’s core or main domain. Technically speaking, then, Shadow Domains™ are web sites consisting entirely of cloaked pages not intended for human perusal.
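Put together, a Shadow Domain™ boils down to a request handler that branches on the visitor’s identity: spiders receive the phantom page, humans get bounced to the core domain. The following self-contained Python sketch (standard library only; the domain, page copy and port are placeholders of our own devising, not a production setup) illustrates the branching logic:

    # Sketch of a Shadow Domain request handler: spiders are served
    # a keyword-optimized phantom page, human visitors are redirected
    # (HTTP 302) to the core domain. All names/content are placeholders.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    CORE_DOMAIN = "http://www.example.com/"  # placeholder core domain

    PHANTOM_PAGE = b"""<html><head>
    <title>Sports Apparel - Discount Sports Apparel Store</title>
    <meta name="keywords" content="sports apparel, jerseys, caps">
    <meta name="description" content="Discount sports apparel, ...">
    </head><body>
    <h1>Sports Apparel</h1>
    <p>Keyword-rich copy targeting 'sports apparel' goes here ...</p>
    </body></html>"""

    def is_spider(user_agent: str) -> bool:
        # See the detection sketch above; IP checks omitted for brevity.
        return any(t in user_agent.lower()
                   for t in ("googlebot", "slurp", "scooter"))

    class ShadowDomainHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if is_spider(self.headers.get("User-Agent", "")):
                # Spider: serve the optimized phantom page.
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                self.wfile.write(PHANTOM_PAGE)
            else:
                # Human visitor: redirect to the core domain.
                self.send_response(302)
                self.send_header("Location", CORE_DOMAIN)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), ShadowDomainHandler).serve_forever()

In real deployments the same decision is typically taken at the web server level – e.g. via mod_rewrite rules or a CGI front end – rather than in a standalone HTTP server, but the logic is identical.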

As these pages aren’t intended for human consumption anyway, the actual text used needn’t really make grammatical “sense”, as long as its semantic content is fairly relevant to your pages’ focus. Thus, for example, you will want to avoid text heavily biased towards an in-depth discussion of web server security or hospital hygiene if you’re actually targeting searchers interested in sports apparel, NASCAR replicas or National Hockey League collectibles, to name but a few typical items.
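As for the “good keyword density” mentioned above: definitions vary, but the underlying arithmetic is simply the share of your copy taken up by the target phrase. A small Python sketch (the sample copy and the exact counting convention are assumptions of our own):

    # Sketch: computing keyword density for phantom page copy.
    # Convention used here: words belonging to phrase matches,
    # divided by the total word count.
    import re

    def keyword_density(text: str, phrase: str) -> float:
        words = re.findall(r"[a-z0-9']+", text.lower())
        target = phrase.lower().split()
        n = len(target)
        hits = sum(1 for i in range(len(words) - n + 1)
                   if words[i:i + n] == target)
        return (hits * n) / len(words) if words else 0.0

    copy = ("Sports apparel at discount prices. Our sports apparel "
            "store stocks sports apparel for every season.")
    print(f"{keyword_density(copy, 'sports apparel'):.1%}")  # 40.0%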

All this, of course, places you squarely between a rock and a hard place: if you had all that content at your fingertips, you wouldn’t have to go for IP delivery in the first place, right? Right! However, there are ways out of this predicament. Indeed, there’s a lot of freely available content out there on the Web you can make use of anytime. And 100% legally, too!