[pmwiki-users] search robot indexing

Robin robin at kallisti.net.nz
Wed Jan 26 09:07:39 CST 2005


On Thursday 27 January 2005 03:54, Neil Herber wrote:
> If you are concerned with well behaved robots, such as Google, they all
> follow the robots exclusion protocol which you can implement by simply
> including a text file at your server root. This text file does require
> manual maintenance as groups are added and so on.
The problem with this is that it isn't very flexible. You can't tell it
to ignore history and edit pages; you can only use it to block whole
groups and pages, and then only if you use UsePathInfo.
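For example, with path-style URLs a robots.txt at the server root might
look like this (a minimal sketch; the group names and URL layout are
assumptions, not taken from any real site):

    User-agent: *
    # Keep robots out of whole groups by path prefix:
    Disallow: /Test/
    Disallow: /PmWiki/
    # Note: no rule here can single out ?action=edit or ?action=diff
    # URLs, because the exclusion standard only matches path prefixes.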

> Note that neither of these methods does anything to deflect spammers and
> vandals. They just give you control over how your site is indexed by the
> good guys.
All the spammers I have seen use something like Google to find targets. I 
block WikiSandbox with robots.txt, and it makes a difference.
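That rule is a one-liner (assuming the sandbox sits at a path like
/Main/WikiSandbox):

    User-agent: *
    Disallow: /Main/WikiSandbox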

-- 
Robin <robin at kallisti.net.nz>             JabberID: <eythian at jabber.org>

Hostile aliens abducted me. What year is it?

PGP Key 0xA99CEB6D = 5957 6D23 8B16 EFAB FEF8  7175 14D3 6485 A99C EB6D