[pmwiki-users] PmWiki namespaces

Eemeli Aro eemeli at gmail.com
Wed Sep 2 04:42:18 CDT 2009


2009/8/31 Petko Yotov <5ko at 5ko.fr>:
> On Monday 31 August 2009 12:08:09 Eemeli Aro wrote:
>> I think that changing some of the
>> fundamentals would in fact be easier, and present a more cohesive
>> whole. There's a lot of potential in page attributes and custom
>> PageStores that I think isn't being exploited as well as it could be.
>
> Thanks for your reply. All this sounds very interesting to me, although still
> abstract. If your customization needs some specific hook or switch in order to
> be enabled from a default PmWiki installation, we'll add it.
>
> I also think that if it is implemented in a good, cohesive, not overly complex
> way and is backwards compatible, it may be considered for inclusion to the
> core. (If it isn't quite compatible, we could consider it for version 2.3 or
> 2.4.)

What I have in mind would probably end up as a part of the core for it
to really make sense. As you say, this is still on a sufficiently
abstract level that I haven't really got to actual implementation
details at all. Here's what I have so far:

PmWiki is made up of a number of different parts: some are now part of
the core pmwiki.php, others live in the scripts/*.php files. I think we
should more clearly separate the parts that depend on the use of
PageStore + wiki markup from the parts that are really the core of the
engine. For example, (:pagelist:) is the markup that gives you a list
of pages, but very little of the code behind it makes any assumptions
about the pages it's listing.
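
To illustrate, something like this needs nothing beyond page names and
page variables (ListPages and PageVar are existing core functions; the
loop itself is just a sketch, not the real pagelist code):

  foreach (ListPages() as $pn)
    print PageVar($pn, '$Titlespaced') . " ($pn)\n";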

Much of this is in fact exactly how PmWiki already does things; I'm
just trying to define a paradigm that may make it easier to understand
how the code works and how to add to or modify its components. One
fundamental that I think needs re-defining or generalising is the word
"page", which should not be limited to wiki markup pages.

Each page is a resource that has some number of attributes, and
nothing but attributes. Each page is retrieved from some set or store
of pages, which encapsulates how the page and its attributes are
actually stored. Externally, these attributes aren't (or at least
shouldn't be) accessed directly, but through a mapping that presents
them as page variables. Many of these variables are just accessor
functions for page attributes, but others give you the values of
global variables or the page URL, and those prefixed with : are
defined within the page contents. For comparison with the current
state, the important thing here is the PageVar function, not the
$FmtPV and $FmtPTV arrays.
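
For comparison, this is roughly what that mapping looks like today:
the first entry below is the core definition (or very close to it),
the second is a made-up example of a variable that isn't a page
attribute at all:

  $FmtPV['$LastModifiedBy'] = '@$page["author"]';  # page attribute
  $FmtPV['$WikiMotto']      = '"edit boldly"';     # made-up constant
  # markup: {$LastModifiedBy}; PHP: PageVar($pagename, '$LastModifiedBy')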

Internally, the store should also define how to read, edit and access
the history of its contents. This is the part that's different from
the current state, where you have the $HandleActions array defining
these for all pages, independent of source. If the store can define
its own functions for these actions, overloading the defaults, it no
longer matters externally whether the contents are wiki markup in a
PageStore object or a binary GIF file.
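
A hypothetical sketch of what I mean (the class and the handler names
are made up; only PageStore and $HandleActions exist today):

  class AttachStore extends PageStore {
    # handlers the dispatcher would consult before $HandleActions
    var $handlers = array(
      'browse'   => 'HandleAttachBrowse',
      'download' => 'HandleAttachDownload');
  }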

I may be abusing the word 'page' a bit; it should probably be
'resource' or something else. Also, defining 'attributes' to include
the page contents as well as the metadata may not be completely clear.

The changes required in the core for the above are actually rather
small, and mostly related to how you map a string to a resource
(MakePageName) and how you pick a function to handle an action
(HandleDispatch). upload.php will need a more thorough rewrite, but
even that is mostly a matter of creating a PageStore-like class and
moving some of the code under that.
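
HandleDispatch itself could stay fairly small. Here's a rough sketch,
assuming stores carry a $handlers array as above and that something
like StoreForPage() can find the store a page belongs to -- both of
those are hypothetical, while $HandleActions, $HandleAuth and Abort
already exist:

  function HandleDispatch($pagename, $action) {
    global $HandleActions, $HandleAuth;
    $store = StoreForPage($pagename);
    if ($store && @$store->handlers[$action])
      $fn = $store->handlers[$action];
    else $fn = @$HandleActions[$action];
    if (!$fn || !function_exists($fn))
      Abort("?unknown action '$action'");
    $fn($pagename, @$HandleAuth[$action]);
  }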

The string-to-resource mapping should work essentially just as it
does now, with 'Group.Page' looking through $WikiLibDirs for the first
match and resolving to the first store in that array if it doesn't
find one. The new thing is that you could target a specific store
using 'Store:PageName'. Each store should also be able to prevent
non-prefixed strings from matching in it, so that attachments, for
example, would always require the Attach: prefix.
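
In configuration terms I'm imagining something like a prefix-to-store
map -- the $PageStorePrefixes name is made up, and AttachStore is the
sketch class from above:

  $PageStorePrefixes['Attach'] =
    new AttachStore('uploads/{$Group}/{$Name}');
  #   'Main.HomePage'          -> searched through $WikiLibDirs as now
  #   'Attach:Main.photo.gif'  -> resolved only by the 'Attach' store,
  #                               which never matches unprefixed names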

The ReadPage and WritePage functions, along with $WikiLibDirs, may
need a little tuning as well, mostly to be able to target a specific
store and to get that information into the $page array. This could in
fact already be done in a very non-intrusive manner -- I'll add a PITS
entry for this.
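
The sketch below shows the kind of change I mean: remember which store
a page came from when reading through $WikiLibDirs. The '=source' key
is a name I just made up, and this isn't meant as a drop-in
replacement for the current ReadPage, only the shape of it:

  function ReadPage($pagename, $since = 0) {
    global $WikiLibDirs;
    foreach ((array)$WikiLibDirs as $dir) {
      $page = $dir->read($pagename, $since);
      if ($page) { $page['=source'] = $dir; return $page; }
    }
    return array();
  }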

As for the attachment metadata, I was thinking of adding a directory
'metadata' to each uploads directory that could have a file for each
attachment, using the PageStore format. At least with the above
proposed store system, writing a cookbook recipe that'd do this
shouldn't be too difficult.
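
A rough sketch of what such a recipe could set up -- the store
variable, the paths and the way an attachment maps to a key are all
just examples, while PageStore, $Author and $Now already exist:

  $AttachMetaStore = new PageStore('uploads/{$Group}/metadata/{$Name}');
  # on upload of e.g. photo.gif to Main.HomePage, record who and when
  $key = 'Main.HomePage-photogif';   # example key format only
  $meta = $AttachMetaStore->read($key);
  if (!$meta) $meta = array();
  $meta['author'] = $Author;
  $meta['time']   = $Now;
  $AttachMetaStore->write($key, $meta);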

>>How should namespaces be identified? With Sisterly I use a prefix
>>"WikiName:" that matches Attach: and Intermap links, as well as how at
>>least MediaWiki handles namespaces, but is there a better way?
>
> First thing coming to my mind is Namespace::Group.Page or
> Namespace//Group.Page.
>
> Also, you may prefer making this separator configurable. I kind of regret that
> currently the Group.Name separator isn't -- and novice users often make
> mistakes. It should be more difficult to link cross-group and cross-field. But
> it is not an easy task to change it (it is not being considered before version
> 2.3 or even 2.4 :-).

It shouldn't be a problem to make this configurable, though at least
in my own head I think I'll keep a single colon as the default
separator -- it's simple, and it matches the current Attach: and
intermap syntax, which could serve as a fallback if some store is
discontinued.
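
Concretely I'm only thinking of something as simple as this, with the
variable name made up:

  $StoreSeparator = ':';   # could be set to '::' or '//' instead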

> P.S. I have a custom shared.d/ directory for doc/form/XLPage/PTV pages in a
> wikifarm. Pages are writable in one of the fields, read-only in all others.
>  http://www.pmwiki.org/wiki/Cookbook/SharedPages
>
> This may satisfy some of the needs of other people in your situation.

Aye, this is similar to what I have: a single common directory with
template and other default pages that's used by the wikis on the farm.
I have it writable from all of them, so I don't need to think about
whether a particular wiki has its own version of a page that masks the
one in the common dir. That's actually one feature I really am missing
and hoping to fix here: figuring out where a page came from.
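
For reference, the read-only side of such a setup is roughly just a
matter of adding the shared directory to $WikiLibDirs -- the shared.d
path below is an example:

  $WikiLibDirs = array(&$WikiDir,
    new PageStore('/home/farm/shared.d/{$FullName}'),
    new PageStore('$FarmD/wikilib.d/{$FullName}'));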

> P.P.S. About this:
>> creating a WikiFarm as easy as adding a few lines of code to a config file
>
> you can mkdirp() different $WorkDir and $LocalDir depending on $HTTP_HOST.
> Then you don't even need to add lines in config files in order to create 999
> new wiki fields...

In my experience, much of the work in setting up a new wiki on a farm
goes into customizing it for its particular needs -- if it doesn't
have any special needs, it rarely needs its own wiki. However, it's
much more difficult to add a second wiki to a farm than the
twenty-second, and making that first step less steep is what I was
talking about.
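
To spell it out, the few lines you mention would look roughly like
this in a farm config -- the paths and the host name cleanup are just
examples:

  $field = preg_replace('/[^-.\w]/', '', @$_SERVER['HTTP_HOST']);
  $WorkDir  = "fields/$field/wiki.d";
  $WikiDir  = new PageStore("fields/$field/wiki.d/{\$FullName}");
  $LocalDir = "fields/$field/local";
  mkdirp($WorkDir);
  mkdirp($LocalDir);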

eemeli


