[pmwiki-users] Active prevention of crawling by web robots

Patrick R. Michaud pmichaud at pobox.com
Mon Aug 8 13:25:46 CDT 2005


On Mon, Aug 08, 2005 at 07:36:02PM +1200, Simon wrote:
> Was there a reason why robots.txt was not used, or the robots meta tag?

Unfortunately, robots.txt often doesn't provide sufficient control
over robot indexing.  For example, it has no way to tell robots
to ignore URLs that contain ?action=edit, ?action=print, etc.
Something like this recipe can give the site administrator more
control over robot indexing (e.g., per-group or per-page controls).
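To illustrate the idea (this is only a sketch of the general approach, not the actual BlockCrawler recipe, and the action list here is an assumption): the wiki can inspect the ?action= value of each request and emit a robots meta tag for actions that shouldn't be indexed.

```python
# Hypothetical sketch: per-request decision on whether a page should
# carry a "noindex" robots meta tag. The set of blocked actions below
# is an assumed example, not taken from the recipe.

BLOCKED_ACTIONS = {"edit", "print", "diff", "source"}

def robots_meta(query_params):
    """Return a robots meta tag for URLs whose ?action= value should
    be hidden from crawlers, or an empty string otherwise."""
    action = query_params.get("action", "browse")  # default wiki view
    if action in BLOCKED_ACTIONS:
        return '<meta name="robots" content="noindex,nofollow" />'
    return ""

print(robots_meta({"action": "edit"}))  # emits the noindex tag
print(robots_meta({}))                  # normal browse: emits nothing
```

Because the check runs per request, the same mechanism extends naturally to per-group or per-page rules, which robots.txt alone can't express.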

Pm

> Daniel Scheibler wrote:
> 
> >At
> >
> >http://www.pmwiki.org/wiki/Cookbook/BlockCrawler
> >
> >I describe a recipe to actively prevent web crawlers.
> >
> >Greets,
> >
> >scheiby.
> > 
> >
> 
> 
> _______________________________________________
> pmwiki-users mailing list
> pmwiki-users at pmichaud.com
> http://host.pmichaud.com/mailman/listinfo/pmwiki-users
> 

More information about the pmwiki-users mailing list