I like the idea of [[tips/integrated_issue_tracking_with_ikiwiki]], and I do so on several wikis. However, as far as I can tell, ikiwiki has no functionality which can represent dependencies between bugs and allow pagespecs to select based on dependencies. For instance, I can't write a pagespec which selects all bugs with no dependencies on bugs not marked as done. --[[JoshTriplett]]
> I started having a think about this. I'm going to start with the idea that expanding
> the pagespec syntax is the way to attack this. It seems that any pagespec that is going
> to represent "all bugs with no dependencies on bugs not marked as done" is going to
> need some way to represent "bugs not marked as done" as a collection of pages, and
> then represent "bugs which do not link to pages in the previous collection".
>
> One way to do this would be to introduce variables into the pagespec, along with
> universal and/or existential [[!wikipedia Quantification]]. That looks quite complex.
>
>> I thought about this briefly, and got about that far.. glad you got
>> further. :-) --[[Joey]]
>> Or, one [[!taglink could_also_refer|pagespec_in_DL_style]] to the language of [[!wikipedia description logics]]: their formulas actually define classes of objects through quantified relations to other classes. --Ivan Z.
>
> Another option would be to go with a more functional syntax. The concept here would
> be to allow a pagespec to appear in a 'pagespec function' anywhere a page can. e.g.
> I could pass a pagespec to `link()` and that would return true if there is a link to any
> page matching the pagespec. This makes the variables and existential quantification
> implicit. It would allow the example requested above:
>
>> `bugs/* and !*/Discussion and !link(bugs/* and !*/Discussion and !link(done))`
>
> Unfortunately, this is also going to make the pagespec parsing more complex because
> we now need to parse nested sets of parentheses to know when the nested pagespec
> ends, and that isn't a regular language (we can't use regular expression matching for
> easy parsing).
>
>> Also, it may cause ambiguities with page names that contain parens
>> (though some such ambiguities already exist with the pagespec syntax).
>
> One simplification of that would be to introduce some pagespec [[shortcuts]]. We could
> then allow pagespec functions to take either pages, or named pagespec shortcuts. The
> pagespec shortcuts would just be listed on a special page, like current [[shortcuts]].
> (It would probably be a good idea to require that shortcuts on that page can only refer
> to named pagespecs higher up that page than themselves. That would stop some
> looping issues...) These shortcuts would be used as follows: when trying to match
> a page (without globs) you look to see if the page exists. If it does then you have a
> match. If it doesn't, then you look to see if a similarly named pagespec shortcut
> exists. If it does, then you check that pagespec recursively to see if you have a match.
> The ordering requirement on named pagespecs stops infinite recursion.
>
> Does that seem like a reasonable first approach?
>
> -- [[Will]]
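
(Aside: a rough Perl sketch of the lookup order described above, purely illustrative and not from any patch on this page. `%shortcuts` stands in for whatever would be loaded from the special shortcuts page.)

    # Illustrative only: bare names in a pagespec prefer a real page of
    # that name and fall back to a shortcut definition otherwise.
    sub match_name_or_shortcut {
        my ($page, $name, %shortcuts) = @_;
        if (exists $IkiWiki::pagesources{$name}) {
            # a page of that name exists, so this is an ordinary page match
            return $page eq $name;
        }
        elsif (exists $shortcuts{$name}) {
            # no such page: recurse into the named pagespec; only allowing
            # references to shortcuts defined higher up the page keeps
            # this recursion finite
            return IkiWiki::pagespec_match($page, $shortcuts{$name});
        }
        return 0;
    }
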
>> Having a separate page for the shortcuts feels unwieldy.. perhaps
>> instead the shortcut could be defined earlier in the scope of the same
>> pagespec that uses it?
>>
>> Example: `define(~bugs, bugs/* and !*/Discussion) and define(~openbugs, ~bugs and !link(done)) and ~openbugs and !link(~openbugs)`
>>> That could work. Parens are only ever nested one deep in that grammar, so it is regular and the current parsing would be OK.
>> Note that I made the "~" explicit, not implicit, so it could be left out. In the case of ambiguity between
>> a definition and a page name, the definition would win.
>>> That was my initial thought too :), but when implementing it I decided that requiring the ~ made things easier. I'll probably require the ~ for the first pass at least.
>> So, equivalent example: `define(bugs, bugs/* and !*/Discussion) and define(openbugs, bugs and !link(done)) and openbugs and !link(openbugs)`
>>
>> Re recursion, it is avoided.. but building a pagespec that is O(N^X) where N is the
>> number of pages in the wiki is not avoided. Probably need to add DOS prevention.
>> --[[Joey]]
>>> If you memoize the outcomes of the named pagespecs you can make it O(N.X), no?
>>> -- [[Will]]
>>>> Yeah, guess that'd work. :-)
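
(Aside: a minimal hand-rolled sketch of that memoization idea, with made-up names; the patch further down actually reaches for the Memoize module instead.)

    # Made-up names, minimal version of the idea (the real patch wraps
    # the generated sub with the Memoize module instead).
    my %specfuncs;   # name => compiled pagespec sub
    my %memo;        # name => { page => cached result }

    sub match_named_spec {
        my ($name, $page) = @_;
        # each (name, page) pair is evaluated at most once per run, so a
        # spec referenced X times across N pages costs O(N*X), not O(N^X)
        return $memo{$name}{$page} if exists $memo{$name}{$page};
        my $sub = $specfuncs{$name} or return 0;
        return $memo{$name}{$page} = $sub->($page);
    }
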
> <a id="another_kind_of_links" />One quick further thought. All the above discussion assumes that 'dependency' is the
> same as 'links to', which is not really true. For example, you'd like to be able to say
> "This bug does not depend upon [ [ link to other bug ] ]" and not have a dependency.
> Without having different types of links, I don't see how this would be possible.
>
> -- [[Will]]
>> I saw that this issue is targeted at by the work on [[structured page data#another_kind_of_links]]. --Ivan Z.
Okie - I've had a quick attempt at this. Initial patch attached. This one doesn't quite work.
And there is still a lot of debugging stuff in there.
At the moment I've added a new preprocessor plugin, `definepagespec`, which is like
a shortcut for pagespecs. To reference a named pagespec, use `~` like this:
[ [!definepagespec name="bugs" spec="bugs/* and !*/Discussion"]]
[ [!definepagespec name="openbugs" spec="~bugs and !link(done)"]]
[ [!definepagespec name="readybugs" spec="~openbugs and !link(~openbugs)"]]
At the moment the problem is in `match_link()` when we're trying to find a sub-page that
matches the appropriate page spec. There is no good list of pages available to iterate over.
`foreach my $nextpage (keys %IkiWiki::pagesources)`
does not give me a good list of pages. I found the same thing when I was working on
this todo [[todo/Add_a_plugin_to_list_available_pre-processor_commands]].
> I'm not sure why iterating over `%pagesources` wouldn't work here, it's the same method
> used by anything that needs to match a pagespec against all pages..? --[[Joey]]
>> My unchecked hypothesis is that %pagesources is created after the refresh hook.
>> I've also been concerned about how globally defined pagespec shortcuts would interact with
>> the page dependency system. Your idea of internally defined shortcuts should fix that. -- [[Will]]
>>> You're correct, the refresh hook is run very early, before pagesources
>>> is populated. (It will be partially populated on a refresh, but will
>>> not be updated to reflect new pages.) Agree that internally defined
>>> seems the way to go. --[[Joey]]
Immediately below is a patch which seems to basically work. Lots of debugging code is still there
and it needs a cleanup, but I thought it worth posting at this point. (I was having problems
with old-style glob lists, so I just switched them off for the moment.)
The following three inlines work for me with this patch:
Bugs:
[ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and ~bugs" archive="yes"]]
OpenBugs:
[ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and ~openbugs" archive="yes"]]
ReadyBugs:
[ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and define(~readybugs,~openbugs and !link(~openbugs)) and ~readybugs" archive="yes"]]
> Nice! Could the specfuncsref be passed in %params? I'd like to avoid
> needing to change the prototype of every pagespec function, since several
> plugins define them too. --[[Joey]]
>> Maybe - it needs more thought. I also considered it when I was going through changing all those plugins :).
>> My concern was that `%params` can contain other user-defined parameters,
>> e.g. `link(target, otherparameter)`, and that means that the specFuncs could be clobbered by a user (or other
>> weird security hole). I thought it better to separate it, but I didn't think about it too hard. I might move it to
>> the first parameter rather than the second. Ikiwiki is my first real perl hacking and I'm still discovering
>> good ways to write things in perl.
>>
>>>> `%params` contains the parameters passed to `pagespec_match`, not
>>>> user-supplied parameters. The user-supplied parameter to a function
>>>> like `match_glob()` or `match_link()` is passed in the second positional parameter. --[[Joey]]
>>>>> OK. That seems reasonable then. The only problem is that my PERLfu is not strong enough to make it
>>>>> work. I really have to wonder what substance was influencing the designers of PERL...
>>>>> I can't figure out how to use the %params. And I'm pissed off enough with PERL that I'm not going
>>>>> to try and figure it out any more. There are two patches below now. The first one uses an extra
>>>>> argument and works. The second one tries to use %params and doesn't - take your pick :-). -- [[Will]]
>> What do you think is best to do about `is_globlist()`? At the moment it requires that the 'second word', as
>> delimited by a space and ignoring parens, is 'and' or 'or'. This doesn't hold in the above example pagespecs (so I just hard wired it to 0 to test my patch).
>> My thought was just to search for 'and' or 'or' as words anywhere in the pagespec. Thoughts?
>>> Dunno, we could just finish deprecating it. Or change the regexp to
>>> skip over spaces in parens. (`/[^\s]+\s+([^)]+)/`) --[[Joey]]
>>>> I think I have a working regexp now.
>> Oh, one more thing. In pagespec_translate (now pagespec_makeperl), there is a part of the regular expression for `# any other text`.
>> This contained `()`, which has no effect. I replaced that with `\(\)`, but that is a change in the definition of pagespecs unrelated to the
>> rest of this patch. In a related change, commands were not able to contain `)` in their parameters. I've extended that so they cannot
>> contain `(` or `)`. -- [[Will]]
>>> `[^\s()]+` is a character class matching all characters not spaces or
>>> parens. Since the previous terminals in the regexp consume most
>>> occurrences of an open paren or close paren, it's unlikely for one to
>>> get through to that part of the regexp. For example, "foo()" will be
>>> matched by the command matcher; "(foo)" will be matched by the open
>>> paren literal terminal. "foo(" and "foo)" can get through to the
>>> end, and would be matched as a page name, if it didn't exclude parens.
>>>
>>> So why exclude them? Well, consider "foo and(bar and baz)". We don't
>>> want it to match "and(" as a page name!
>>>
>>> Escaping the parens in the character class actually changes nothing; the
>>> changed character class still matches all characters not spaces or
>>> parens. (Try it!).
>>>
>>> Re commands containing '(', I don't really see any reason not to
>>> allow that, unless it breaks something. --[[Joey]]
>>>> Oh, I didn't realise you didn't need to escape parens inside []. All else
>>>> I understood. I have stopped commands from containing parens because
>>>> once you allow that then you might have an extra level of depth in the parsing
>>>> of define() statements. -- [[Will]]
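
(Aside: a quick way to convince yourself of the character-class point above; escaped or not, both classes reject exactly the same characters.)

    #!/usr/bin/perl
    use strict;
    use warnings;

    # [^\s()] and [^\s\(\)] are the same character class; escaping the
    # parens inside the brackets changes nothing.
    for my $s ("foo", "foo(", "(bar)", "and(", "a b") {
        my $plain   = join '', $s =~ /([^\s()])/g;
        my $escaped = join '', $s =~ /([^\s\(\)])/g;
        printf "%-6s matched: %-5s (%s)\n", $s, $plain,
            $plain eq $escaped ? "same" : "DIFFERENT";
    }
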
>>> Updated patch. Moved the specFuncsRef to the front of the arg list. Still haven't thought through the security implications of
>>> having it in `%params`. I've also removed all the debugging `print` statements. And I've updated the `is_globlist()` function.
>>> I think this is ready for people other than me to have a play. It is not well enough tested to commit just yet.
>>> -- [[Will]]
I've lost track of the indent level, so I'm going back to not indented - I think this is a working [[patch]] taking into
account all comments above (which doesn't mean it is above reproach :) ). --[[Will]]
> Very belated code review of last version of the patch:
>
> * `is_globlist` is no longer needed
>> Good :)
> * I don't understand why the pagespec match regexp is changed
> from having flags `igx` to `ixgs`. Don't see why you
> want `.` to match `\n` in it, and don't see any `.` in the regexp
> anyway?
>> Because you have to define all the named pagespecs in the pagespec, you sometimes end up with very long pagespecs. I found it useful to split them over multiple lines. That didn't work at one point and I added the 's' to make it work. I may have further altered the regex since then to make the 's' redundant. Remove it and see if multi-line pagespecs still work. :)
>>> Well, I can tell you that multi-line pagespecs are supported w/o
>>> your patch .. I use them all the time. The reason I find your
>>> use of `/s` unlikely is because without it `\s` already matches
>>> a newline. Only if you want to treat a newline as non-whitespace
>>> is `/s` typically necessary. --[[Joey]]
> * Some changes of `@_` to `%params` in `pagespec_makeperl` do not
> make sense to me. I don't see where \%params is defined and populated,
> except with `\$params{specFunc}`.
>> I'm not a perl hacker. This was a mighty battle for me to get going.
>> There is probably some battlefield carnage from my early struggles
>> learning perl left here. Part of this is that @_ / @params already
>> existed as a way of passing in extra parameters. I didn't want to
>> pollute that top level namespace - just add my own parameter (a hash)
>> which contained the data I needed.
>>> I think I understand how the various `%params`
>>> (there's not just one) work in your code now, but it's really a mess.
>>> Explaining it in words would take pages.. It could be fixed by,
>>> in `pagespec_makeperl` something like:
>>>
>>>     my %specFuncs;
>>>     push @_, specFuncs => \%specFuncs;
>>>
>>> With that you have the hash locally available for populating
>>> inside `pagespec_makeperl`, and when the `match_*` functions
>>> are called the same hash data will be available inside their
>>> `@_` or `%params`. No need to change how the functions are called
>>> or do any of the other hacks.
>>>
>>> Currently, specFuncs is populated by building up code
>>> that recursively calls `pagespec_makeperl`, and is then
>>> evaluated when the pagespec gets evaluated. My suggested
>>> change to `%params` will break that, but that had to change
>>> anyway.
>>>
>>> It probably has a security hole, and is certainly inviting
>>> one, since the pagespec definition is matched by a loose regexp (`.*`)
>>> and then subject to string interpolation before being evaluated
>>> inside perl code. I recently changed ikiwiki to never interpolate
>>> user-supplied strings when translating pagespecs, and that
>>> needs to happen here too. The obvious way, it seems to me,
>>> is to not generate perl code, but just directly run perl code that
>>> populates specFuncs. (A rough sketch of that idea appears just before the patch below.)
> * Seems that the only reason `match_glob` has to check for `~` is
> because when a named spec appears in a pagespec, it is translated
> to `match_glob("~foo")`. If, instead, `pagespec_makeperl` checked
> for named specs, it could convert them into `check_named_spec("foo")`
> and avoid that ugliness.
>> Yeah - I wanted to make named specs syntactically different on my first pass. You are right in that this could be made a fallback - named specs always override pagenames.
> * The changes to `match_link` seem either unnecessary, or incomplete.
> Shouldn't it check for named specs and call
> `check_named_spec_existential`?
>> An earlier version did. Then I realised it wasn't actually needed in that case - match_link() already included a loop that was like a type of existential matching. Each time through the loop it would
>> call match_glob(). match_glob() in turn will handle the named spec. I tested this version briefly and it seemed to work. I remember looking at this again later and wondering if I had misunderstood
>> some of the logic in match_link(), which might mean there are cases where you would need an explicit call to check_named_spec_existential() - I never checked it properly after having that thought.
>>> In the common case, `match_link` does not call `match_glob`,
>>> because the link target it is being asked to check for is a single
>>> page name, not a glob.
> * Generally, the need to modify `match_*` functions so that they
> check for and handle named pagespecs seems suboptimal, if
> only because there might be others people may want to use named
> pagespecs with. It would be possible to move this check
> to `pagespec_makeperl`, by having it check if the parameter
> passed to a pagespec function looked like a named pagespec.
> The only issue is that some pagespec functions take a parameter
> that is not a page name at all, and it could be weird
> if such a parameter were accidentally interpreted as a named
> pagespec. (But, that seems unlikely to happen.)
>> Possibly. I'm not sure which I prefer between the current solution and that one. Each has advantages and disadvantages.
>> It really isn't much code for the match functions to add a call to check_named_spec_existential().
>>> But if a plugin adds its own match function, it has
>>> to explicitly call that code to support named pagespecs.
> * I need to check if your trick to avoid infinite recursion
> works if there are two named specs that recursively
> call one-another. I suspect it does, but will test this
> myself..
>> It worked for me. :)
> * I also need to verify if memoizing the named pagespecs has
> really guarded against very expensive pagespecs DOSing the wiki..
> --[[Joey]]
>> There is one issue that I've been thinking about that I haven't raised anywhere (or checked myself), and that is how this all interacts with page dependencies.
>> Firstly, I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases.
>>> The problem I can see there is that if two pagespecs
>>> get merged and both use `~foo` but define it differently,
>>> then the second definition might be used at a point when
>>> it shouldn't (but I haven't verified that really happens).
>>> That could certainly be a show-stopper. --[[Joey]]
>> Secondly, it seems that there are two types of dependency, and ikiwiki
>> currently only handles one of them. The first type is "Rebuild this
>> page when any of these other pages changes" - ikiwiki handles this.
>> The second type is "rebuild this page when set of pages referred to by
>> this pagespec changes" - ikiwiki doesn't seem to handle this. I
>> suspect that named pagespecs would make that second type of dependency
>> more important. I'll try to come up with a good example. -- [[Will]]
>>> Hrm, I was going to build an example of this with backlinks, but it
>>> looks like that is handled as a special case at the moment (line 458 of
>>> render.pm). I'll see if I can break
>>> things another way. Fixing this properly would allow removal of that special case. -- [[Will]]
>>>> I can't quite understand the distinction you're trying to draw
>>>> between the two types of dependencies. Backlinks are a very special
>>>> case though and I'll be surprised if they fit well into pagespecs.
>>>> --[[Joey]]
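
(Before the patch itself, a rough sketch of the "directly run perl code that populates specFuncs" suggestion from the review above. This is an illustration only, not the patch: it assumes the existing `pagespec_translate` is used to turn the sub-spec into a closure, so the user-supplied text never gets interpolated into a string that is later eval'd.)

    # Illustrative only: handle define() at translate time instead of
    # emitting code that populates specFuncs when the pagespec runs.
    my %specFuncs;

    sub handle_define {
        my ($name, $spec) = @_;
        # the sub-spec goes through the normal translation machinery;
        # nothing from $spec is interpolated into generated perl here
        $specFuncs{$name} = IkiWiki::pagespec_translate($spec);
    }

    sub match_named {
        my ($page, $name, @args) = @_;
        my $sub = $specFuncs{$name}
            or return IkiWiki::FailReason->new("no pagespec named $name");
        return $sub->($page, @args, specFuncs => \%specFuncs);
    }
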
----
diff --git a/IkiWiki.pm b/IkiWiki.pm
index 4e4da11..8b3cdfe 100644
--- a/IkiWiki.pm
+++ b/IkiWiki.pm
@@ -1550,7 +1550,16 @@ sub globlist_to_pagespec ($) {
sub is_globlist ($) {
my $s=shift;
- return ( $s =~ /[^\s]+\s+([^\s]+)/ && $1 ne "and" && $1 ne "or" );
+ return ! ($s =~ /
+ (^\s*
+ [^\s(]+ # single item
+ (\( # possibly with parens after it
+ ([^)]* # with stuff inside those parens
+ (\([^)]*\))*)* # maybe even nested parens
+ \))?\s*$
+ ) |
+ (\s and \s) | (\s or \s) # or we find 'and' or 'or' somewhere
+ /xs);
}
sub safequote ($) {
@@ -1631,7 +1640,7 @@ sub pagespec_merge ($$) {
return "($a) or ($b)";
}
-sub pagespec_translate ($) {
+sub pagespec_makeperl ($) {
my $spec=shift;
# Support for old-style GlobLists.
@@ -1650,12 +1659,14 @@ sub pagespec_translate ($) {
|
\) # )
|
- \w+\([^\)]*\) # command(params)
+ define\(\s*~\w+\s*,((\([^()]*\)) | ([^()]+))+\) # define(~specName, spec) - spec can contain parens 1 deep
+ |
+ \w+\([^()]*\) # command(params) - params cannot contain parens
|
[^\s()]+ # any other text
)
\s* # ignore whitespace
- }igx) {
+ }igxs) {
my $word=$1;
if (lc $word eq 'and') {
$code.=' &&';
@@ -1666,16 +1677,23 @@ sub pagespec_translate ($) {
elsif ($word eq "(" || $word eq ")" || $word eq "!") {
$code.=' '.$word;
}
- elsif ($word =~ /^(\w+)\((.*)\)$/) {
+ elsif ($word =~ /^define\(\s*~(\w+)\s*,(.*)\)$/s) {
+ $code .= " (\$params{specFuncs}->{$1}="; # (exists \$params{specFuncs}) &&
+ $code .= "memoize(";
+ $code .= &pagespec_makeperl($2);
+ $code .= ")";
+ $code .= ") ";
+ }
+ elsif ($word =~ /^(\w+)\((.*)\)$/s) {
if (exists $IkiWiki::PageSpec::{"match_$1"}) {
- $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \@_)";
+ $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \%params)";
}
else {
$code.=' 0';
}
}
else {
- $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \@_)";
+ $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \%params)";
}
}
@@ -1683,8 +1701,18 @@ sub pagespec_translate ($) {
$code=0;
}
+ return 'sub { my $page=shift; my %params = @_; '.$code.' }';
+}
+
+sub pagespec_translate ($) {
+ my $spec=shift;
+
+ my $code = pagespec_makeperl($spec);
+
+ # print STDERR "Spec '$spec' generated code '$code'\n";
+
no warnings;
- return eval 'sub { my $page=shift; '.$code.' }';
+ return eval $code;
}
sub pagespec_match ($$;@) {
@@ -1699,7 +1727,7 @@ sub pagespec_match ($$;@) {
my $sub=pagespec_translate($spec);
return IkiWiki::FailReason->new("syntax error in pagespec \"$spec\"") if $@;
- return $sub->($page, @params);
+ return $sub->($page, @params, specFuncs => {});
}
sub pagespec_valid ($) {
@@ -1748,11 +1776,78 @@ sub new {
package IkiWiki::PageSpec;
+sub check_named_spec($$;@) {
+ my $page=shift;
+ my $specName=shift;
+ my %params=@_;
+
+ error("Unable to find specFuncs in params to check_named_spec()!") unless exists $params{specFuncs};
+
+ my $specFuncsRef=$params{specFuncs};
+
+ return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ unless (substr($specName, 0, 1) eq '~');
+
+ $specName = substr($specName, 1);
+
+ if (exists $specFuncsRef->{$specName}) {
+ # remove the named spec from the spec refs
+ # when we recurse to avoid infinite recursion
+ my $sub = $specFuncsRef->{$specName};
+ delete $specFuncsRef->{$specName};
+ my $result = $sub->($page, %params);
+ $specFuncsRef->{$specName} = $sub;
+ return $result;
+ } else {
+ return IkiWiki::FailReason->new("Page spec '$specName' does not exist");
+ }
+}
+
+sub check_named_spec_existential($$$;@) {
+ my $page=shift;
+ my $specName=shift;
+ my $funcref=shift;
+ my %params=@_;
+
+ error("Unable to find specFuncs in params to check_named_spec_existential()!") unless exists $params{specFuncs};
+ my $specFuncsRef=$params{specFuncs};
+
+ return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ unless (substr($specName, 0, 1) eq '~');
+ $specName = substr($specName, 1);
+
+ if (exists $specFuncsRef->{$specName}) {
+ # remove the named spec from the spec refs
+ # when we recurse to avoid infinite recursion
+ my $sub = $specFuncsRef->{$specName};
+ delete $specFuncsRef->{$specName};
+
+ foreach my $nextpage (keys %IkiWiki::pagesources) {
+ if ($sub->($nextpage, %params)) {
+ my $tempResult = $funcref->($page, $nextpage, %params);
+ if ($tempResult) {
+ $specFuncsRef->{$specName} = $sub;
+ return $tempResult;
+ }
+ }
+ }
+
+ $specFuncsRef->{$specName} = $sub;
+ return IkiWiki::FailReason->new("No page in spec '$specName' was successfully matched");
+ } else {
+ return IkiWiki::FailReason->new("Named page spec '$specName' does not exist");
+ }
+}
+
sub match_glob ($$;@) {
my $page=shift;
my $glob=shift;
my %params=@_;
+ if (substr($glob, 0, 1) eq '~') {
+ return check_named_spec($page, $glob, %params);
+ }
+
my $from=exists $params{location} ? $params{location} : '';
# relative matching
@@ -1782,11 +1877,12 @@ sub match_internal ($$;@) {
sub match_link ($$;@) {
my $page=shift;
- my $link=lc(shift);
+ my $fulllink=shift;
my %params=@_;
+ my $link=lc($fulllink);
my $from=exists $params{location} ? $params{location} : '';
-
+
# relative matching
if ($link =~ m!^\.! && defined $from) {
$from=~s#/?[^/]+$##;
@@ -1804,19 +1900,32 @@ sub match_link ($$;@) {
}
else {
return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
- if match_glob($p, $link, %params);
+ if match_glob($p, $fulllink, %params);
}
}
return IkiWiki::FailReason->new("$page does not link to $link");
}
sub match_backlink ($$;@) {
- return match_link($_[1], $_[0], @_);
+ my $page=shift;
+ my $backlink=shift;
+ my @params=@_;
+
+ if (substr($backlink, 0, 1) eq '~') {
+ return check_named_spec_existential($page, $backlink, \&match_backlink, @params);
+ }
+
+ return match_link($backlink, $page, @params);
}
sub match_created_before ($$;@) {
my $page=shift;
my $testpage=shift;
+ my @params=@_;
+
+ if (substr($testpage, 0, 1) eq '~') {
+ return check_named_spec_existential($page, $testpage, \&match_created_before, @params);
+ }
if (exists $IkiWiki::pagectime{$testpage}) {
if ($IkiWiki::pagectime{$page} < $IkiWiki::pagectime{$testpage}) {
@@ -1834,6 +1943,11 @@ sub match_created_before ($$;@) {
sub match_created_after ($$;@) {
my $page=shift;
my $testpage=shift;
+ my @params=@_;
+
+ if (substr($testpage, 0, 1) eq '~') {
+ return check_named_spec_existential($page, $testpage, \&match_created_after, @params);
+ }
if (exists $IkiWiki::pagectime{$testpage}) {
if ($IkiWiki::pagectime{$page} > $IkiWiki::pagectime{$testpage}) {