
Google Seething Over Belgian Judgment


Being compelled to remove certain news sources from the Google index and Google News was not a big deal for the company, but being required to post the judgment on its Belgian homepage apparently touched a nerve with the search advertising company.

Google's Whetstone posted some of the company's side of the story, pointing publishers to robots.txt files. "If publishers don't want their websites to appear in search results (most do) the robots.txt standard (something that webmasters understand) enables them to prevent automatically the indexing of their content," Whetstone wrote. "It's nearly universally accepted and honored by all reputable search engines."

Robots.txt has been around as a standard for nearly as long as spidering technology has existed for indexing websites. It seems Copiepresse and its affiliated publications could easily have placed robots.txt files on their sites and avoided the need for any legal action.

David Utter is a staff writer for Murdok covering technology and business.
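For readers unfamiliar with the mechanism Whetstone describes, here is a minimal sketch. The robots.txt rules below (hypothetical, for an assumed example.com site) bar Google's crawler from an entire site while leaving other crawlers unaffected; Python's standard-library `urllib.robotparser` shows how a well-behaved crawler would interpret them:

```python
# Sketch of the robots.txt mechanism, using Python's stdlib parser.
from urllib import robotparser

# Hypothetical robots.txt: block Googlebot site-wide, allow everyone else.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A reputable crawler checks can_fetch() before requesting a page.
print(parser.can_fetch("Googlebot", "https://example.com/article"))     # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/article"))  # True
```

A publisher who wanted out of Google's index only (rather than all search engines) would publish rules like these at the root of its site, e.g. `https://example.com/robots.txt`.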
