[pmwiki-users] Trouble with .pageindex when too much _new_ data to index (+ sqlite)

Petko Yotov 5ko at 5ko.fr
Tue Mar 3 15:02:31 CST 2015


Just to clarify, I didn't suggest using the XMLPageStore recipe (I 
haven't used it yet).

When I wrote about PerGroupSubDirectories, I meant the core PmWiki 
functionality, which can be enabled with a line in config.php:

   http://www.pmwiki.org/wiki/Cookbook/PerGroupSubDirectories
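
For example, something along these lines in local/config.php (a minimal 
sketch; see the page above for the exact line recommended for your 
PmWiki version):

   $WikiDir = new PageStore('wiki.d/{$Group}/{$FullName}');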

About backups, I find it easier to run an incremental nightly backup 
program that downloads only the files (pages) added or modified since 
the previous backup than to download the full, huge SQLite database file.
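
Just to illustrate the idea (this is not the actual program I use; the 
paths and the timestamp file below are placeholders), a minimal PHP 
sketch of such an incremental copy could look like this:

   <?php
   # Copy only the page files modified since the previous run.
   $src   = 'wiki.d';            # source page directory
   $dest  = 'backup/wiki.d';     # backup destination
   $stamp = 'backup/.last-backup';
   $since = file_exists($stamp) ? (int)file_get_contents($stamp) : 0;
   $files = new RecursiveIteratorIterator(
     new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS));
   foreach ($files as $f) {
     if ($f->isFile() && $f->getMTime() > $since) {
       $target = $dest . '/' . substr($f->getPathname(), strlen($src) + 1);
       @mkdir(dirname($target), 0755, true); # create subdirectories as needed
       copy($f->getPathname(), $target);     # copy only new/changed pages
     }
   }
   file_put_contents($stamp, time());        # remember this run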

Petko

On 2015-03-03 13:44, ABClf wrote:
> The main reason I'm playing with the SQLite recipe is that I have a 
> lot of data I wish to publish with PmWiki rather than with another 
> program. I have Words (70k), Quotes (100k), and Sources (3k).
> The Words group is the most important; every Word should have a 
> pagelist that retrieves and formats the linked Quotes.
> 
> Storing each item separately as a dedicated (short or very short) 
> page would be very handy, with a pair of links connecting the data, 
> although it would create a large number of pages. That is acceptable 
> for a database, but not easy to handle (backups, massive search and 
> replace, etc.) with text files.
> The huge .pageindex might be an issue as well, and I'm not sure 
> pagelists would still work.
> (My last tests with SQLite were ultimately not positive, as I reached 
> its limit, but I might get better results with text files.)
> 
> So I'm now going to experiment with text files (using the 
> XMLPageStore recipe).
> 
> As for PerGroupSubDirectories, are there any known issues with 
> setting up something like this (up to about 12,000 files max in a 
> folder)? For example:
> 
> # XML PageStore
> $EnablePageStoreXML = 1;
> include_once('cookbook/XMLPageStore.php');
> $WikiDir = new XMLPageStore('wiki.d/{$Group}/{$Name1}/{$FullName}');
> 
> Well, no, I don't want to rely on little-used recipes for the main task.
> 
> Thank you,
> Gilles.
> 
> 
> 
> 
> 2015-03-01 0:51 GMT+01:00 Petko Yotov <5ko at 5ko.fr>:
> 
>> The error you describe comes from the SQLite recipe and could appear 
>> if the database cannot correctly return the page records, which can 
>> happen when there is too much data or not enough time or memory.
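>> 
>> To illustrate the failure pattern only (this is not the recipe's 
>> actual code, and the table/column names and $pdo connection below 
>> are made up): with PDO in its default error mode, a failed query 
>> returns false, and calling fetchAll() on that result produces 
>> exactly this kind of fatal error.
>> 
>>   $stmt = $pdo->query('SELECT name, time FROM pages'); # may be false
>>   if ($stmt === false) {
>>     # the query failed (timeout, memory limit, corrupt data...)
>>   } else {
>>     $rows = $stmt->fetchAll();  # only safe when query() succeeded
>>   }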
>> 
>> Is there any particular reason to use the still-experimental SQLite 
>> PageStore class, which is largely untested on large-volume or 
>> high-traffic wikis? Current filesystems are quite fast, and with a 
>> recipe like PerGroupSubdirectories, PmWiki should be able to handle 
>> a large number of small pages.
>> 
>> Probably even on cheap shared hosting. Or, if you require more 
>> dedicated power but cannot configure, secure, and maintain a VPS or 
>> dedicated server, there are "Performance" shared hosting plans, for 
>> example:
>>   http://www.ovh.com/fr/hebergement-web/offres-performance.xml
>>   http://www.1and1.fr/hebergement-web?__lf=Static&linkOrigin=serveur-infogere&linkId=hd.subnav.hosting.hebergement-web
>> 
>> You appear to be importing your pages from an existing database. It 
>> may be possible to use some of the recipes listed here:
>>   http://www.pmwiki.org/wiki/Cookbook/CompareDatabaseRecipes
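>> 
>> As a rough sketch only (the action name, database path, table and 
>> column names below are assumptions about your data, not an actual 
>> recipe), a one-off import could call PmWiki's ReadPage()/WritePage() 
>> functions from a custom action defined in config.php:
>> 
>>   $HandleActions['importdb'] = 'HandleImportDb';
>>   $HandleAuth['importdb'] = 'admin';       # require admin permissions
>>   function HandleImportDb($pagename, $auth = 'admin') {
>>     if (!RetrieveAuthPage($pagename, $auth, true)) Abort('admin required');
>>     $db = new PDO('sqlite:/path/to/source.db');
>>     foreach ($db->query('SELECT name, text FROM words') as $row) {
>>       # build a page name like Words.SomeWord (very naive sanitizing)
>>       $pn = 'Words.' . preg_replace('/\W+/', '', ucfirst($row['name']));
>>       $page = ReadPage($pn, READPAGE_CURRENT); # existing page or new array
>>       $page['text'] = $row['text'];
>>       WritePage($pn, $page);                   # PmWiki writes the page file
>>     }
>>     print 'Import finished.';
>>   }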
>> 
>> Petko
>> 
>> On 2015-02-27 22:38, ABClf wrote:
>> 
>>> I'm still (desperately) fighting with PmWiki.
>>> My last try was to upload a fresh PmWiki and the SQLite recipe, set 
>>> up the configuration, and put the SQLite database (100 MB) somewhere 
>>> on my website (1&1 shared host); then request the Site.Reindex page.
>>> 
>>> I'm getting this SQLite-recipe-related error at the very beginning 
>>> (just after the count was done; nothing was created in wiki.d):
>>> 
>>> Fatal error: Call to a member function fetchAll() on a non-object in
>>> /homepages/18/d269604285/htdocs/dev4/cookbook/sqlite.php on line 403
>>> 
>> 
>> 