The concept of page cloaking has come under fire again because the technique is being used by a number of legitimate sites to protect or hide their content from users and/or search engine bots. The fact that these sites are not punished for using cloaking techniques has become a sore spot with some bloggers.
Wikipedia defines cloaking as a black hat SEO technique in which the content presented to the search engine spider differs from the content presented to the user's browser. Based on the visitor's IP address or User-Agent header, a script delivers a different version of the page to the search engines, so they display the page in results where it would not otherwise appear.
Basically, you are presenting search engine bots with a certain kind of content while delivering different content to the site visitor. Normally, the cloaked pages are created to fool search engines in order to get better result rankings. However, what if you are using cloaking procedures for legitimate reasons like protecting paid content or serving different content based on the visitor's IP address? Should sites doing this be subject to the same penalties? It depends on whom you ask.
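To make the mechanism concrete, here is a minimal sketch of User-Agent-based cloaking of the kind described above. The crawler signatures and page bodies are invented for illustration; real implementations vary widely.

```python
# Hypothetical sketch of User-Agent cloaking, for illustration only:
# a request handler checks the User-Agent header and serves known
# crawlers different content than regular visitors.
# The bot list and page bodies below are invented for this example.

KNOWN_BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def is_search_engine_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

def serve_page(user_agent: str) -> str:
    """Serve the full article to crawlers, a login wall to everyone else."""
    if is_search_engine_bot(user_agent):
        return "<html><body>Full article text for indexing.</body></html>"
    return "<html><body>Please log in to read this article.</body></html>"
```

This is exactly the pattern behind the "paid content" examples: the spider indexes the full text, while a human visitor hits the login page.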
Lenssen has some issues with Google seemingly allowing WebmasterWorld to cloak its pages, which goes against the search engine's guidelines. He performed a search related to CMS and PHP, and a WebmasterWorld post held the first position. However, when Lenssen tried to access the page from the search results, he was taken to a login page - another example of cloaking in action (unfortunately, when I try to duplicate the search, I am taken directly to the content).
Both Lenssen and Graywolf wonder how these otherwise legitimate sites get away with these cloaking exercises when Google and the rest are explicitly against the practice. However, the examples given by both bloggers represent the "white-hat" side of cloaking in the sense that they are not trying to game the search engines. These sites and companies are merely trying to protect their content.
However, this does not matter to either Lenssen or Graywolf. Because Google has actually addressed this issue in their guidelines, both believe there should be no quarter when it comes to punishing the guilty parties, whether the sites have a legitimate reason for cloaking or not. They also feel Google's Matt Cutts should address the situation so there will be no more confusion.
At the Chicago SES, while it was never explicitly stated (at least in the sessions I attended), there seems to be a growing sentiment that as long as the webmaster isn't trying to be deceptive, search engines will tolerate some cloaking. The Wikipedia page discusses delivering content based on a visitor's IP location.
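The IP-location case can be sketched the same way. This toy example maps address prefixes to regions; a real site would use a GeoIP database, and the prefix-to-region table here is purely illustrative (the ranges are the RFC 5737 documentation blocks, not real allocations).

```python
# Minimal sketch of IP-based content delivery (geotargeting), the
# "legitimate cloaking" case discussed above. The prefix table and
# greetings are invented for this example.

REGION_BY_PREFIX = {
    "203.0.113.": "au",    # documentation-only ranges (RFC 5737)
    "198.51.100.": "us",
}

LOCALIZED_GREETING = {"au": "G'day!", "us": "Hello!", "default": "Welcome!"}

def region_for_ip(ip: str) -> str:
    """Look up a region by longest-known address prefix, else 'default'."""
    for prefix, region in REGION_BY_PREFIX.items():
        if ip.startswith(prefix):
            return region
    return "default"

def localized_content(ip: str) -> str:
    """Pick the page variant for the visitor's region."""
    return LOCALIZED_GREETING[region_for_ip(ip)]
```

The intent here is user experience, not rankings, which is why this kind of content switching draws far less criticism than classic cloaking.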
Cloaking Is Bad... Unless It's Good