don't go on and --refresh.
This way, if a previous aggregation job is running, we don't add additional
load doing work that job will do anyway.
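
A minimal sketch of the non-blocking lock check this implies, in plugin
context (the lock-file location and the use of flock are illustrative
assumptions, not the plugin's actual code):

    use Fcntl qw(:flock);

    my $lockfile = "$config{wikistatedir}/aggregatelock";  # assumed path

    open(my $fh, '>', $lockfile) or die "cannot open $lockfile: $!";
    if (! flock($fh, LOCK_EX | LOCK_NB)) {
            # Another aggregation run holds the lock; bail out rather
            # than duplicating the work it will do anyway.
            exit 0;
    }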

Previously, prune("wiki/srcdir/sandbox/test.mdwn") could delete srcdir
or even wiki, if they happened to be empty. This is rarely what you
want: there's usually some base directory (destdir, srcdir, transientdir,
or another subdirectory of wikistatedir) beyond which you do not want to
delete.
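
A sketch of the stop-at-base behaviour described above (the signature
and names are illustrative; the real helper may differ):

    # Remove a file, then remove newly-empty parent directories,
    # but never ascend past $base.
    sub prune {
            my ($file, $base) = @_;
            unlink($file);
            my $dir = $file;
            while ($dir =~ s{/[^/]+$}{} && $dir ne $base) {
                    rmdir($dir) or last;  # stops at the first non-empty dir
            }
    }

    prune("wiki/srcdir/sandbox/test.mdwn", "wiki/srcdir");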

Let's just try to write and fall back to a short ugly filename on error.
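
Read as a hedged sketch in plugin context (writefile and %config come
from IkiWiki; the md5-based fallback naming is an assumption):

    use Digest::MD5 qw(md5_hex);

    my $file = $page;
    eval { writefile($file, $config{srcdir}, $content) };
    if ($@) {
            # Filesystem rejected the nice name (too long, bad bytes):
            # fall back to a short ugly one.
            $file = "item_" . md5_hex($page) . ".html";
            writefile($file, $config{srcdir}, $content);
    }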

Two problems fixed:
1. Files are written with a .ikiwiki-new suffix, which has to be taken into
   account.
2. Need to count the length in bytes, not in Unicode characters.
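
A sketch of the resulting length check (255 bytes is the usual
filesystem limit and is an assumption here):

    use Encode qw(encode_utf8);

    # The name must still fit once ".ikiwiki-new" is appended, and
    # the limit applies to the UTF-8 bytes, not the characters.
    my $max = 255 - length(".ikiwiki-new");
    if (length(encode_utf8($filename)) > $max) {
            # too long once encoded: switch to a shorter fallback name
    }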

connections.
Making outgoing IPv6 connections for openid auth is still broken; the glue
module does not seem to solve that, so I did not make openid use it.

Since the plugin abuses the checkconfig hook to launch aggregation when in
--aggregate mode, it should give other plugins that have checkconfig hooks
a chance to run before they are possibly used in rendering the aggregated
content.
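
IkiWiki's hook() accepts an ordering flag for this; a sketch of the
registration:

    # Run this checkconfig hook after other plugins' checkconfig
    # hooks, so their configuration is in effect before aggregation
    # renders anything.
    hook(type => "checkconfig", id => "aggregate",
            call => \&checkconfig, last => 1);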

cookiejar configuration setting can be used by other plugins to provide a
custom `cookie_jar` object for LWP::UserAgent. (Thanks, schmonz)
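
A sketch of how such a setting plugs in (the HTTP::Cookies fallback
shown is an assumption, not the shipped default):

    use LWP::UserAgent;
    use HTTP::Cookies;

    # Anything honouring the HTTP::Cookies interface will do; a
    # plugin can drop its own object into $config{cookiejar}.
    my $ua = LWP::UserAgent->new(
            cookie_jar => $config{cookiejar}
                    || HTTP::Cookies->new(file => "$ENV{HOME}/.cookies",
                                          autosave => 1),
    );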

Assume the aggregated content is only going to be in one of the
directories, and so stop if it's successfully removed from the
transientdir.
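
A sketch of that short-circuit ($transientdir and $file are
illustrative names):

    # The page exists in only one of these directories, so stop
    # at the first successful removal; transientdir goes first.
    foreach my $dir ($transientdir, $config{srcdir}) {
            if (unlink("$dir/$file")) {
                    last;
            }
    }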

That template is user-controlled.

array of things that need to be built. (Backwards compatibility code keeps
plugins using the old interface working.)
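
A sketch of the newer interface (the grep is a made-up example):

    # A needsbuild hook now returns the (possibly modified) array
    # ref of files needing a build, rather than only mutating it.
    sub needsbuild {
            my $needsbuild = shift;
            @$needsbuild = grep { $_ ne "internal/skipme.mdwn" } @$needsbuild;
            return $needsbuild;
    }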

Not needed; lastupdate will be 0 for new feeds.

.ikiwiki/aggregatetime, to allow for more sophisticated cron jobs.
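
A guess at the mechanism, for illustration only (the file's exact
contents are an assumption):

    # Record when the next aggregation is due, so a cron job can
    # test the file cheaply before starting ikiwiki --aggregate.
    writefile("aggregatetime", $config{wikistatedir},
            $next_due . "\n");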

Besides being wrong to do, this could lead to the wrong item
being expired, as follows: if B is added and at the same time
A is changed, then A's ctime may be set to the current time,
while B's is set to its creation time. Thus the new item, B,
is incorrectly removed as older.
(This interacted especially badly with the bug fixed by
90b4d079605b72bb50d1da41402d994960e10937.)

The aggregate state merge code neglected to merge changes to the md5
field of an item. Therefore, if an item's md5 changed after initial
aggregation, it would be updated, and rewritten, each time thereafter.
This was wasteful and indirectly led to some expire problems.

See [[bugs/Aggregated_Atom_feeds_are_double-encoded]]. By default,
XML::Atom outputs strings of UTF-8 bytes with the Perl UTF8 flag stripped
off, which IkiWiki assumes to be Latin-1 and re-encodes as UTF-8 on
output. XML::Feed does not currently (0.41-1) set the magic variable to
change this behaviour (I've filed a bug on CPAN), but IkiWiki can
usefully set the same variable as a workaround.
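
The workaround amounts to setting XML::Atom's flag before parsing,
along these lines:

    use XML::Feed;

    # Make XML::Atom return Perl character strings (UTF8 flag on)
    # instead of raw UTF-8 bytes, so IkiWiki does not mistake them
    # for Latin-1 and double-encode on output.
    $XML::Atom::ForceUnicode = 1;

    my $feed = XML::Feed->parse(\$xml);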

This can happen when a new field,
such as the new lasttry, is added.
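
Presumably handled by defaulting missing fields when old state is
loaded; a sketch:

    # State saved by an older version may lack newer fields;
    # default them rather than treating the record as corrupt.
    foreach my $feed (values %feeds) {
            $feed->{lasttry} = 0 unless defined $feed->{lasttry};
    }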

aggregation is run, even if the usual time has not passed. Closes: #508622
(Michael Gold)

The old method failed for '[' x 3.

holger reported that decode_utf8 was crashing with perl 5.8.8. Earlier, I
thought that passing 0 to the function avoided this with old perls, but
that was apparently not enough; it still crashes. So, put it inside the
eval, so we can at least recover from it crashing.
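
A sketch of the defensive pattern described:

    use Encode;

    # decode_utf8 can die outright under perl 5.8.8 even with a
    # CHECK value of 0, so call it inside eval and fall back to
    # the raw bytes on failure.
    my $text = eval { decode_utf8($raw, 0) };
    $text = $raw if $@ || ! defined $text;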

links. Since this needs the just-released XML::Feed 0.3, as well as a
not-yet-released XML::RSS, it will fall back to the old method if no
xml:base info is available.
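
The resolution itself is the standard URI trick; a sketch ($base
standing in for whatever xml:base the parser exposes):

    use URI;

    # Resolve a possibly-relative link against the entry's
    # xml:base, falling back to the feed's own URL without one.
    my $url = URI->new_abs($link, $base || $feedurl)->as_string;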

The machine-parseable date needs to include a timezone.
Also, simplified the interface for date display.
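
For example, an ISO 8601 timestamp in UTC carries an explicit
timezone marker (a sketch, not necessarily the template's format):

    use POSIX qw(strftime);

    # Machine-parseable date with a timezone, e.g. for a
    # <time datetime="..."> attribute.
    my $machine_date = strftime("%Y-%m-%dT%H:%M:%SZ", gmtime($time));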

in the future.

newpagefile.
Note that newpagefile is not used here (or in recentchanges) because
the internal-use pages they generate are transient and unlikely to
benefit from each being put in their own subdir.

I saw this in the wild: apparently a page was not present on disk but was
in the aggregate db, and not marked as expired either. Not sure how that
happened, but such pages should get marked as expired, since they have an
effectively zero ctime.
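
A sketch of the guard, using IkiWiki's %pagesources to detect the
missing source (the real check may differ):

    # A guid whose page is gone from disk effectively has ctime 0;
    # mark it expired instead of leaving a dangling db entry.
    if (defined $guid->{page} && ! exists $pagesources{$guid->{page}}) {
            $guid->{expired} = 1;
    }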

The expiry code does need to make sure to sort in ctime order, even if
expiring by count, so it expires the right ones.
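
That is, both expiry policies should walk the items oldest-first;
a sketch:

    # Sort oldest-first by ctime before expiring by age or by
    # count, so the items removed are genuinely the oldest.
    my @candidates = sort { $a->{ctime} <=> $b->{ctime} }
                     grep { ! $_->{expired} } values %guids;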

elements.

needs to wait for the pages to be rendered though)

too many plugins.. brain exploding..

They were a bit confusing, since they did not actually set the default,
and example values are sufficient.

This handles deleting empty directories too.

(merge commit)
Conflicts:
	IkiWiki/Plugin/aggregate.pm