[pmwiki-users] How much data can a wiki page take? (Was: Bibliographies)

Bernd Wiemann wiemann at ddz.uni-duesseldorf.de
Tue Sep 12 09:27:05 CDT 2006


Hello Patrick,

Is there a way to prevent the pageindex file from indexing the whole
page? Something like $pageindex=20000 or action?=search.summary?

My wiki on localhost has about 3000 pages of articles and stories:
60 MB of data plus 10 MB for the pageindex.

Searches timing out (only the first time, but localhost has too many
first times...) made me think about saving huge pages as internal
PDFs instead.
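
[Editor's note: a possible approach, sketched here as an assumption rather than a confirmed answer to the question above. PmWiki does provide an $EnablePageIndex setting in local/config.php to turn the .pageindex file off entirely; a per-page size cap like the hypothetical $pageindex=20000 mentioned above is not, as far as I know, a stock setting.]

```php
<?php
// local/config.php -- minimal sketch, assuming a stock PmWiki install.
// Disable the .pageindex file entirely; searches then scan the page
// files directly (slower searches, but no large index file on disk):
$EnablePageIndex = 0;
```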

Thanks.
Bernd

Patrick R. Michaud wrote:
> On Tue, Sep 12, 2006 at 10:39:54AM +0200, christian.ridderstrom at gmail.com wrote:
>> On Tue, 12 Sep 2006, John Rankin wrote:
>>> On Saturday, 9 September 2006 11:16 PM, christian.ridderstrom at gmail.com wrote:
>>>> It might be possible, but it is probably not so simple to do it exactly 
>>>> like that online. IIRC, it has incremental search for finding the 
>>>> references, and a database can get quite big. I've seen BibTeX files
>>>> that are a few hundred kB.
>>>>
>>> Do you think a few 100kB bib file will be a problem if it's stored as a
>>> wiki page? 
>> I don't know... let's ask Patrick! Patrick?
>>
>> My guess is however that as long as you don't want to show it very often 
>> it is not a problem.
> 
> PmWiki itself doesn't have a problem with very long pages other than
> the time it takes to render them (which _can_ be a problem on
> slow servers).   At http://www.pmwiki.org/wiki/Test/LargePage
> there's a page that has over 350K of text in it.
> 
> Pm
> 
> _______________________________________________
> pmwiki-users mailing list
> pmwiki-users at pmichaud.com
> http://www.pmichaud.com/mailman/listinfo/pmwiki-users
> 
> 
