Frequently Asked Questions
Q: Is cloaking today as relevant as it was 5 years ago? Do web 2.0 sites and other easy link sources & hosts still make it quite profitable? How has cloaking changed over the years?
A: Like all things search, cloaking has changed over the years. Initially, it was sufficient to simply cloak single pages on your site, giving you a mix of cloaked and open pages. Then, it became more about avoiding risks to your money sites plus enhanced scalability by deploying self-contained, independent cloaked sites – Shadow Domains™ – effectively restricting your cloaking efforts to these SDs, which could be discarded and easily replaced with fresh ones should they be caught out by the search engines.
Today, cloaking has evolved to both include and target RSS feeds, promoting them via the aggregators and feed directories, and we offer new functionality enhancing page structure variance, inclusion of graphics, CSS, etc. to make the SDs appear even more organic to the spiders. Finally, we also offer a vastly improved text generation system as well.
Until recently, cloaking generally addressed only on-site factors, optimizing web pages for the search engine spiders. What it didn’t do per se is attend to off-site work such as link building. So now once you’ve started to roll out your SDs, you’ll still have to throw a decent number of good links at them to make their rankings stick. However, this isn’t a change in technology so much as in SEO strategy: once links became all-important, you had to add link building to your arsenal of SEO techniques just like everyone else.
Is it still relevant, i.e. effective? Most definitely – provided you know what you’re doing and run a tight ship strategy-wise. Essentially, this is nothing new: it simply comes with changing search engine algos and new platforms (such as blogs, social bookmarking sites, etc.).
As for Web 2.0 sites, we’re mainly leveraging them for link building. It’s actually quite easy to promote cloaked sites or pages via the social networking platforms these days because people have become so well accustomed to being redirected when browsing the Web that it doesn’t tend to raise any eyebrows anymore.
Q: Some well funded web 2.0 sites do things like list “relevant keywords” and “keywords sending traffic to this page”… what is the difference between cloaking and such an automated approach to keyword rich content generation? Why is one considered bad with the other being considered fine?
A: Well, cloaking or IP delivery in the technical sense is, of course, about displaying different content to search engine spiders than to human visitors. What these Web 2.0 sites are actually doing is going for the old worn keyword stuffing technique, not cloaking proper. (Well, not as a rule, anyway.)
It’s actually quite funny to see well-trafficked sites like that adopt an amateurish level of purported search engine optimization which we, as professional SEOs, long ago dismissed as no longer effective. There are many plausible explanations for this, though in the main it’s probably all about fundamental cluelessness. But because these sites are getting tons of traffic from sources other than organic search, and because the search engines are concerned about losing large chunks of their traffic and search market share (think Facebook and Twitter for two prime examples), they seem to be giving them unabashedly preferential treatment which no ordinary mom-and-pop web site can ever hope to be blessed with.
To the uninformed, this may actually seem to endorse such dated SEO techniques, though this is an entirely false conclusion. It’s actually not the keyword and link stuffing at all that helps these sites achieve high rankings, PageRank, etc. – rather, it’s all those other factors your run-of-the-mill site cannot easily emulate.
On the client front, we’re experiencing a lot more openness towards “black hat” SEO such as cloaking etc. than e.g. 3-4 years ago. Generally, people aren’t as impressed or as easily conned by the search engines’ (especially Google’s) FUD tactics regarding anything they don’t like. Sure, they’re worried about possibly losing their sites in the search engine indices, but the number of people who’ll simply swallow everything Google feeds them by way of their peculiar gospel of what a “good boy or girl” should do or refrain from in terms of SEO is positively on the decrease.
Q: As Google pushed nofollow and became more liberal with the “black hat” label it seems there is less discussion about black hat vs white hat. Do you agree with that? And if so, why has that conversation died down?
A: We think it’s because people are getting more pragmatic about things. Maybe it’s the novelty of doing business on the Web wearing off, maybe it’s the vast variety of divergent opinions and schools of thought in SEO and the unprecedented exposure the importance of organic search engine optimization is enjoying in the media.
Whatever it may actually be, we agree that the debate has become de-emotionalized, less religious even. When we started off with formal SEO services back in the late nineties, the debate was all about “ethical” versus “unethical” SEO. Lots of gut level reactions then to what was, after all, merely a technological, not a theological or moral issue. Add to that the increasingly competitive environment people have to cope with on the Internet and it all figures rather nicely. You might arguably say that Web commerce as a whole has matured, as, of course, has the SEO industry proper.
These days, when you speak with clients they won’t flinch one bit if you ask them whether they want to opt for a “white hat” or a “black hat” approach. Rather, they’ll inquire about efficacy, the relative risks and so on. So it’s a pretty much unexcited, hands-on discussion, which is a very good thing.
Q: Google’s Matt Cutts often tries to equate search engine manipulators with criminals. And yet the same search results will sell exposure to virtually anyone willing to pay for it. From a linguistic and framing standpoint, what gives Google such dominance over the SEO conversation?
A: We’ve recently dubbed Matt Cutts Google’s “FUD Czar” for this very reason, not that we expect it will stop him from pursuing that course in future. Next thing we may find him equating black hat SEOs with kiddie porn peddlers, Colombian drug cartels and white slavery racketeers…
We find this a fairly worrying though certainly not unexpected development. It’s an established scare tactic we’ve seen deployed ever and again in human history: lump your detractors in with whatever foes everyone is concerned about and let the muck rub off. It’s how witch hunts and, in the political field, totalitarian propaganda, especially the fascist kind, have always been conducted.
We know we may get quite a bit of flak for this, but the way we view things Google as a corporation has subscribed to an essentially totalitarian mindset. It’s quite clear for anyone to see: in their public statements, in the way they tend to react to criticism, and of course, even more importantly, in the vast array of technologies and data conduits they’re rolling out to dominate all the time.
This being the Information Age, information is equated with power – this is a pervasive meme that’s dominated Western culture for centuries if not millennia. And this is precisely what Google is trying to monopolize – alas, quite successfully.
But not to worry, we won’t set out on a rant with a long-winded academic analysis of Google’s crypto-fascist ideology and praxis here. Suffice it to say that we’ve studied these matters in some depth for many years now. This isn’t about some whacko conspiracy theory; it’s about cold, hard-nosed and sober analysis and evaluation of verifiable facts.
Q: Why is it that Google thinks highly of public relations (even if founded on lies) but thinks poorly of most other bulk link building strategies?
A: A search engine’s primary objective is NOT to “deliver relevance”, as so many people are fond of fooling themselves and others; it’s to make a profit, period. Verbatim: “A search engine’s primary purpose is NOT to deliver relevancy. A search engine’s primary purpose is to deliver revenue. That is not the same thing.”
While many SEOs still seem to find it hard to come to terms with that, it’s pretty obvious that the folks over at Google were pretty slow to learn that lesson themselves. Oh, they certainly did so in the end, and with a vengeance, too. But along with this came all the other trimmings that will make or unmake just about any commercial enterprise, an ingrained preference for low pay being compensated with lots of feel good high talk for the suckers included.
Of course, hypocrisy plays a major role in this field as well: just like “spam” is always what the other guy is doing, not you yourself, “public relations” is always ok for Google if it helps you ramp up your company to potential client status. At the end of the day you’ll have to conduct a lot of public relations to be able to afford some serious Adwords advertising – simple as that. So it makes no sense killing the cows you actually want to milk further down the road.
By contrast, however, undetected paid links will negatively impact Google’s fundamental business platform because they can’t really deal with them effectively, being so very link biased as they are (or used to be) – so they’re bound to be slated as a big no-no from their point of view.
None of this is illogical in any way – but of course that doesn’t mean that we as SEOs have got to like or condone it. I know for sure that I don’t…
Q: There has been a big upsurge in widespread “black hat operations”, mainly coming about from people’s white-hat losses. What do you make of it all? Has Google screwed itself over by opening this can of worms, and where do you see it ending? This is a war, isn’t it?
A: Yes, it is decidedly a war and, what’s more, it is escalating all the time. That’s not just our own, arguably biased view at Fantomaster Inc.; it’s what that whole “surge of black hat operations” you’re rightly referring to proves beyond doubt by its very occurrence. It’s quite true, and our sales figures reflect it beyond any reasonable doubt as well: ever more web marketers are beginning to realize that they don’t stand a snowball’s chance in hell SEM-wise without resorting to cloaking in the current environment if they’re targeting even a moderately competitive market.
While I’m not exactly famous for schmoozing with the likes of Google, in this case I’d be reluctant to place the blame squarely on their doorstep. Of course the engines are involved in all this, but in my view the fundamental issue is really one of dramatically exploding global competition. There are simply too many people vying for a piece of what’s essentially the same finite, limited cake, which makes for a veritable no-holds-barred cut-throat environment quite unprecedented in human history. And of course it’s also a result of the widely increased penetration of Internet technology – think broadband and mobile phones here, for example – which has brought a whole lot more technological awareness to the market as well: gone are the days when people could successfully brag at parties about “not needing all that weird newfangled Internet stuff”…
Add the fact that generating automated content on the fly has become both viable and affordable for just about everyone setting out to run a business on the Web, and you’ll easily see that the whole balloon is expanding almost at the speed of light.
The search engines are merely trying to come to terms with this immense amount of data, which is actually growing exponentially all the time.
Q: If a person gets to a Product X page when they looked for Product X, why do you think Google has such a big issue with Cloaking?
A: Control – they’re control freaks, that’s all. Not that it’s very logical for them to be: After all, it’s definitely not their own content they’re making a living off, still they want each and every webmaster to play by their rules and their rules alone, period.
Now we guess it won’t come as a particular surprise to anyone that we at Fantomaster nurture an entirely different view. If it’s our web site, if it’s our content, then it’s our own bloody business what we choose to do with it. If they don’t like what we do, they’ll chuck us out of their index, provided they find out about it. We’re ok with that, too, but when all is said and done it’s an epic contest of minds – actually, all of SEO is, not just cloaking or “black hat” techniques.
We’ve been in this business long enough to remember the days when the search engines (or, rather, their reps) were wont to treat each and every SEO as mere scum, as dirty spoilsports who’d best vanish from the face of the earth, the sooner the better. Of course, that was a pretty self-serving and hypocritical agenda, and we’ll concede that they have backed down a bit on this score, which, interestingly enough, came about roughly with the advent of PPC, when what would later become Overture was still making it up as they went along in their incarnation as GoTo.com.
On the other hand, it’s not as if the engines were really clamping down in earnest on cloaking as they have always pretended to do. To some extent they’ll even tolerate it if it’s not too blatantly misleading, as in your example, or too spammy. It’s common wisdom amongst the more experienced and sophisticated SEOs that probably 80-90% of all Fortune 1000 companies are making use of cloaking one way or another these days. Ok, so if they do get caught out due to some stupid glitch, as was the case with BMW not very long ago, it’ll create a bit of a stink. But overall – where would it leave the search engines if they actually banned all those big boys from their indices? Fat chance of that ever happening.
So while the risks of cloaking are quite real, in the same stride they’re actually pretty remote provided you play your cards right – and do it well, of course.
Q: Fantomaster has been a long-time proponent of cloaking. The search engines, Google in particular, have come out strongly against the use of such techniques. Can you talk a little about this?
A: Well, for one thing the search engines themselves are the Web’s #1 cloakers. If you’re located in Belgium and enter “Google.com” in your browser, only to be redirected, like it or not, to Google.be, that’s cloaking. If your site serves different pages to different browsers, that’s cloaking. If you serve French content to surfers logging in via a French IP, while presenting Dutch IPs with Dutch pages, that’s cloaking, too.
So, a lot of cloaking is actually simply about customization and personalization – nothing “bad” about it at all, quite the contrary.
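The geo-targeting the answer describes can be sketched in a few lines: the same URL serves different content depending on the country a visitor's IP resolves to. This is a minimal illustration, assuming a country code has already been resolved from the IP; the lookup table and file names are hypothetical, and a real system would use a geo-IP database.

```python
# Hypothetical country-to-content mapping; real systems resolve the
# visitor's IP to a country code via a geo-IP database first.
CONTENT_BY_COUNTRY = {
    "FR": "page_fr.html",  # French visitors get French pages
    "NL": "page_nl.html",  # Dutch visitors get Dutch pages
}

def page_for(country_code: str) -> str:
    """Pick the localized page, falling back to a default version."""
    return CONTENT_BY_COUNTRY.get(country_code, "page_default.html")
```

The point of the sketch is that the serving logic is identical to cloaking in the technical sense: one URL, different content per visitor, selected server-side.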
Let’s face it: search engine spiders are a very dumb lot and despite some interesting exotic indexing models being flaunted occasionally here or there, there’s no realistic indication that this sorry state of affairs will be overcome in any conceivable time frame.
The very same applies to most content management systems: while they may help you save lots of time and effort in running a highly informative web site, when it comes to efficient search engine optimization you might as well dump them down the drain and send me your money instead. But seriously – search engines are just not up to what current Web technology has to offer and is actually featuring all over the place.
That’s mainly because, regardless of how much they may hype their own setups, search engines are simply not state-of-the-art. In fact, you will stand the very best chance of achieving decent rankings only if you stick to the most basic of HTML, avoiding all frames, forgetting about animated graphics, and so on and so forth. Or, possibly, if you subscribe to a minimalist, purely text-based web design and layout philosophy. All this may be just fine for a whole lot of web sites; however, for very many corporations, for whole industries even, this approach is simply not feasible and does not constitute a viable solution by any standard. How are you going to keep Flash and Shockwave streams off a gaming or a video site, for example? What if you’re in the graphics business, or if your news portal software requires URLs with tons of weird characters and session codes in them on which the spiders will simply choke? And why, for heaven’s sake, should you make your site design and layout the slave of technologically challenged search engine spiders in the first place?
So this is where cloaking or, rather, IP delivery comes in. To recapitulate: IP delivery is a technology that serves different content to search engine spiders than to human visitors, based on the visitor’s (human or otherwise) IP address. This requires special software (such as the technology we have developed, hint, hint!) to determine who is who or what. The only truly reliable way to do this is by knowing the spiders’ IP addresses, so what your software will also require is a comprehensive database of verified search engine spiders. That’s why Fantomaster developed its own fantomas spiderSpy™ service, which happens to be the world’s most comprehensive, with thousands of spiders referenced and updated every six hours.
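The decision logic just recapitulated is simple to sketch: check the visitor's IP against a database of verified spider addresses and pick the page variant accordingly. This is a minimal illustration; the IP entries below are placeholders, not verified spider addresses, and a production setup would consult a maintained, frequently refreshed database rather than a hard-coded set.

```python
# Placeholder spider IP set; a real deployment would load a verified,
# regularly updated database (the interview mentions a 6-hourly refresh).
KNOWN_SPIDER_IPS = {
    "192.0.2.10",   # placeholder entry, not a real spider address
    "198.51.100.7", # placeholder entry
}

def select_page(visitor_ip: str) -> str:
    """Return which page variant to serve, based purely on the visitor's IP."""
    if visitor_ip in KNOWN_SPIDER_IPS:
        return "phantom_page.html"   # spider-optimized "phantom page"
    return "regular_page.html"       # the page human visitors see
```

Note that the main site's pages are never touched: the split happens entirely at serving time, which is why the answer stresses that no source-code tweaks to the main domain are needed.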
The advantage of IP delivery is that you don’t have to touch or tweak your main domain’s source code in any way. As no human visitors will get to see your cloaked content (what we term “phantom pages”) anyway, gone are all worries regarding site design, graphics overload, browser compatibility, conflicts of interest between aesthetics and search engine spiders’ requirements, and a lot of other time-wasting issues pulling your resources away from your real task, namely driving decent, qualified traffic to your site. You can now optimize those phantom pages for better search engine rankings at your own discretion, and no one will be the wiser.
So can cloaking be abused? Sure it can! But so can kitchen knives and painkillers. We for our part have never advocated misleading search engine optimization, if only because it’s dumb marketing: if you find a site offering second-hand books in a search engine, what are you going to do if you’re redirected to a porn site instead? You’re going to get annoyed with the porn site – it’s really as simple as that. Will you buy anything there, even if you were into that sort of thing? Most probably not. There’s no excuse in the world for misleading surfers like that, and it certainly doesn’t seem to pay off either, which is why we’re actually seeing less and less of that sort of thing these days, and we can’t say we’re too unhappy about it.
But let’s face realities here: while the search engines may take a strong-arm stance against cloaking in public, they don’t really seem to worry too much about it in everyday life. One of the reasons being that there’s so much legitimate cloaking about, it would simply be impossible to weed it all out. Else, you might well expect the world’s top 1000 web properties to disappear from the search engine indices, and where would that leave them, loss of advertising revenue apart?
It’s quite important to realize this fact before fretting about the possible penalization of cloaking, as so many clueless SEOs are wont to, scaring the horses left, right and center without a single tangible proof of what they’re claiming to know absolutely everything about.
Is it possible to get banned or penalized for cloaking? Yes, it is. But is it likely? Hardly – the search engines are far too busy trying to eke some money out of their services. Maybe that’s why they are hardly investing anything in advancing search technology to make tricks like cloaking obsolete. In any case, you can neatly avoid penalization by working with what we call “Shadow Domains”, i.e. domains dedicated to giving the search engines appropriate spider fodder while redirecting human visitors at system level, without delay, to the main domain proper. That way, should you really ever be penalized for cloaking – again, an extremely rare occurrence – all you will normally lose is that particular Shadow Domain. But then, all we do is register a new one for you and start from scratch. Simple as that.
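The Shadow Domain behavior described above can be sketched as a request handler: spiders receive crawlable content, while human visitors are immediately redirected to the main domain. This is a sketch under stated assumptions; the domain name and the simple set-membership spider check are hypothetical stand-ins for a real spider-IP database lookup.

```python
# Assumed main domain (illustrative placeholder).
MAIN_DOMAIN = "https://www.example-main-site.com"

def is_spider(visitor_ip: str, spider_ips: set) -> bool:
    """Stand-in for a lookup against a verified spider-IP database."""
    return visitor_ip in spider_ips

def handle_request(visitor_ip: str, spider_ips: set) -> dict:
    """Shadow Domain handler: fodder for spiders, instant redirect for humans."""
    if is_spider(visitor_ip, spider_ips):
        # Spider: serve the optimized, crawlable content directly.
        return {"status": 200, "body": "<html>spider fodder</html>"}
    # Human visitor: redirect without delay to the main domain proper.
    return {"status": 302, "location": MAIN_DOMAIN}
```

Because the Shadow Domain carries all the risk, losing it costs nothing but the domain registration; the main site's code and rankings profile are never involved.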
Used responsibly, cloaking will actually give everyone concerned the best of all worlds: search engines will become aware, albeit indirectly, of sites their spiders are unable to crawl properly because of their antediluvian technology. Webmasters in turn will be happy with their rankings. And most importantly, surfers will gain access to web sites they would, in all probability, otherwise never know about.
We are aware that this may sound a bit optimistic, but let’s for a change face the fact that it’s actually the search engine spiders, not the users, that are dumb – users can always, and actually will, vote with their mice on whether your site’s high ranking was justified and relevant to their search. By contrast, most if not all search engines seem to be run by people who subscribe to a patriarchal, hierarchic view of human nature, claiming competence on what users are supposed to see and what not. They’re control freaks, and that’s what the whole issue of cloaking actually boils down to: the search engines, parasites that they basically are, living off other people’s (the webmasters’) labor, want to retain control over everything including web design and layout. Little wonder that an ever growing number of webmasters and corporations are beginning to resent just that …
Q: The SEM world exists in a strange place where the relationship between the search engine and the SEM is not clearly defined. How do you think the search engines feel about those who practice SEM, and do you ever see a point in time where both sides will see eye to eye?
A: The way we see it, there’s a long, ignominious tradition of search engines regarding – and actually treating – SEMs as mere trash. Here they are with all that purportedly wonderful technology of theirs, with their brilliant ranking algorithms and their chauvinist “my database is bigger than your database” attitude – and along come we SEMs spoiling the show. We have yet to meet a single search engine representative acknowledging in public that search engine optimizers and marketers may actually have something good to contribute to the overall Web ecology.
You can see it ever and again at search engine conferences: all those search engine reps admonishing the participants not to do this, not to do that, to stick to this, to do only that – “be a good boy/girl across the board, do what daddy tells you, and we just might be a wee bit nice to you” – wagging their index fingers and threatening SEMs with dirty looks, just like a bloomin’ nursery! And here are all these adult people, webmasters and marketing officers alike, gobbling it all up in awe like gospel. More often than not, it’s a pretty pathetic spectacle.
Certainly we would endorse a more constructive relationship between both parties, one which would actually help both sides serve the Web community best. We don’t know of any SEM who wouldn’t. But is it likely to happen within the foreseeable future? Hardly, we’re afraid.
Q: Some hard-line SEOs denounce any technique they see as being outside the published SE terms of service as “spam”. What is your take on so-called “ethical seo”?
A: There are those who would say that ‘ethics’ is just a cloaked form of hypocrisy.
But seriously – there’s a pervading myth in the search engine marketing and optimization industry that if you’re a good boy, the engines will pat your head and reward you with fine rankings, even if it may take an incarnation or two. That’s unfortunate because not only does it fuzz up the hardcore technological issues involved, it also attracts all sorts of gut-level thinkers to the SEM world, flogging their gut-level advice (“content is king” being just one pervasive popular myth in question) and confusing each other and everybody else. This is a basically religious, moralistic attitude, and quite an inadequate one when dealing with technological issues.
A more rational approach would certainly seem in order here. We’ve talked about abusing cloaking already. Don’t do it! No, it won’t make you end up in hell, but it will irritate your visitors. Meaning that they will take their business elsewhere, period. So the search engines are devoting a lot of energy to setting up rules of conduct, fine. This may be a sensible thing to do, at least from their point of view. But don’t expect them to spill the beans on what they are actually doing. Google won’t tell you exactly how it determines rankings. Again, this is fine – it’s their game after all, so why shouldn’t they try to call the tune.
But if you set out to use search engine generated traffic for your business model, you ought to realize that there’s a generic conflict of interest built in: you may want good rankings to achieve good returns, while the search engines couldn’t care less about your revenue. All they ever want your content for is to expand their database to become more attractive to surfers. It’s a numbers game, and as an individual webmaster you’re always being shortchanged: if your business goes belly up, the search engines will simply feature someone else on their SERPs without wasting one thought on you. After all, they have billions of other pages to choose from.
Ethical behavior only makes sense amongst equals. So, as a webmaster, are you really an equal in the search engines’ view? No, you aren’t – the odds are stacked solidly against you, and that’s where the fun starts. In fact, as long as webmasters and search engines cannot agree by mutual consent on a rigorously enforced code of standards, worrying about the ethics possibly involved is a mere pastime for self-appointed prophets who love aggrandizing themselves by self-righteously sermonizing others from their pulpits. And yes, this may include many a search engine representative, too!