[pmwiki-users] Compat1x, enormous pages, and memory limits

Donald Gordon don at dis.org.nz
Tue Oct 3 18:18:12 CDT 2006


Hi

I'm trying to convert a PmWiki v1 wiki to a v2 wiki using compat1x.php,
but have run into a snag.  Enormous pages cause PHP to run out of
memory when converting the markup.

My quick fix is to replace the single array-based call to preg_replace
in compat1x.php with a loop over the individual patterns, which seems
to greatly help matters.  The PageStore1x class with this change is
shown below:

class PageStore1x extends PageStore {
  function read($pagename) {
    global $Compat1x, $KeepToken;
    $page = parent::read($pagename);
    if ($page) {
      # protect [=...=] and [@...@] regions from the conversion rules
      $page['text'] = preg_replace('/(\\[([=@]).*?\\2\\])/se',
        "Keep(PSS('$1'))", @$page['text']);
      # apply the conversion rules one at a time, instead of passing
      # the whole $Compat1x arrays to a single preg_replace() call
      foreach ($Compat1x as $pattern => $replacement)
        $page['text'] = preg_replace($pattern, $replacement, $page['text']);
      # restore the protected regions
      $page['text'] = preg_replace("/$KeepToken(\\d.*?)$KeepToken/e",
        '$GLOBALS[\'KPV\'][\'$1\']', $page['text']);
    }
    return $page;
  }
}

As far as I can tell, when performing a preg_replace on arrays, PHP4
does no memory reclamation until the call returns.  With this change
(and the disabling of one large extension), our out-of-memory errors
go away.
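To illustrate the difference, here's a minimal standalone sketch.  The
$rules table below is a hypothetical stand-in for the real $Compat1x
array, just to show that the one-shot array form and the loop form
produce the same result -- the loop simply lets PHP free each
intermediate copy of the page text before applying the next rule:

```php
<?php
// Hypothetical stand-in for the $Compat1x pattern => replacement table.
$rules = array(
  '/_OLD_/'   => 'NEW',
  '/\bfoo\b/' => 'bar',
);

$text = 'some _OLD_ markup with foo in it';

// One-shot form: all patterns and replacements passed as arrays
// in a single preg_replace() call.
$single = preg_replace(array_keys($rules), array_values($rules), $text);

// Loop form (the change proposed above): one rule per call, so each
// intermediate string can be reclaimed before the next rule runs.
$looped = $text;
foreach ($rules as $pattern => $replacement)
  $looped = preg_replace($pattern, $replacement, $looped);

echo ($single === $looped) ? "same result\n" : "differ\n";
```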

Any chance of this change getting into the release?

thanks

donald