Now that dependencies are a list of pagespecs with an implicit "or"
operation, there's no need to try to merge pagespecs under normal use.
ikiwiki-transition contains the only use of the function, so move
it there rather than deleting it entirely (it's used to concatenate all
admins' lists of locked pages).
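
The merge in question is essentially an "or" join of two pagespec
strings. A minimal sketch of such a helper, for illustration only; the
name and exact behaviour of the shipped function may differ:

    # Join two pagespecs so a page matching either one matches the result.
    sub pagespec_merge {
        my ($a, $b) = @_;
        return $a if $a eq $b;      # identical specs need no join
        return "($a) or ($b)";      # otherwise combine with "or"
    }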
|
On a large wiki you can spend a lot of time reading through large lists
of dependencies to see whether files need to be rebuilt (album, with its
one-page-per-photo arrangement, suffers particularly badly from this).
The dependency list is currently a single pagespec, but it's not used like
a normal pagespec - in practice, it's a list of pagespecs joined with the
"or" operator.

Accordingly, change it to be stored as a list of pagespecs. On a wiki
with many tagged photo albums, this reduces the time to refresh after
`touch tags/*.mdwn` from about 31 to 25 seconds.

Getting the benefit of this change on an existing wiki requires a rebuild.
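
With the dependencies kept as a list, a refresh can test each stored
pagespec in turn and stop at the first match. A rough sketch of that
shape (the helper name here is illustrative; %depends and
pagespec_match are the existing IkiWiki globals):

    # Does any of $page's stored dependency pagespecs match $changed?
    # Short-circuits on the first matching spec.
    sub depends_on_changed {
        my ($page, $changed) = @_;
        foreach my $spec (@{$depends{$page}}) {
            return 1 if pagespec_match($changed, $spec);
        }
        return 0;
    }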
|
by plugins in the index. Fix this bug.
|
need a srcdir.
|
.ikiwiki, abort with an error rather than creating it.
|
This is partly an optimisation and partly a bug fix. In one
test case I have available, it can speed a page build up from 3
minutes to 3 seconds.

The root of the problem is that $links{$page} contains arrays of
links, rather than hashes of links. When a link is found, it is
just pushed onto the array, without checking for duplicates.

Now, the array is emptied before scanning a page, so there
should not be much opportunity for duplicate links to pile up in
it. But in some cases they can, and if there are hundreds of
duplicate links in the array, then scanning it for matching
links, as match_link and some other code does, becomes much more
expensive than it needs to be.

Perhaps the real right fix would be to change the data structure
to a hash. But the list of links is never accessed like that;
you always want to iterate through it.

I also looked at deduplicating the list in saveindex, but that
does a lot of unnecessary work and doesn't completely solve the
problem.

So, finally, I decided to add an add_link function that handles
deduplication, and to make ikiwiki-transition remove the old
duplicate links.
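
A deduplicating add_link along those lines might look roughly like this
(sketch only; the function actually added to IkiWiki.pm may differ in
detail):

    # Record a link from $page, skipping it if it is already present
    # in the per-page array of links.
    sub add_link {
        my ($page, $link) = @_;
        push @{$links{$page}}, $link
            unless grep { $_ eq $link } @{$links{$page}};
    }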
|
This is easier to remember, and less error-prone than passing it all the
pages in the wiki.
|
A new ikiwiki-transition moveprefs subcommand can pull the old data out of
the userdb and inject it into the setup file.

Note that it leaves the old values behind in the userdb too. I did this
because I didn't want to lose data if writing the setup file fails for
some reason, and the old data in the userdb only uses a small amount of
space. Running the command multiple times will mostly not change anything.
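
A typical invocation looks something like this (the setup file name is
only an example):

    ikiwiki-transition moveprefs wiki.setup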
|
Also fixed a bug in how aggregateinternal used IkiWiki::Setup::load,
and added checks for arguments to other subcommands.
|
Usage:
1. Update all pagespecs that use aggregated pages to use internal()
2. ikiwiki-transition aggregateinternal $srcdir $htmlext
   (where $srcdir and $htmlext are the srcdir and htmlext options in
   your .setup file)
3. Add aggregateinternal to your .setup file
4. Rebuild the wiki
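
For example, with a srcdir of ~/wiki and an htmlext of html (values
shown are only an illustration), step 2 becomes:

    ikiwiki-transition aggregateinternal ~/wiki html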
|
transition works again.
|
This implements the previously documented hashed password support.

While implementing that, I noticed a security hole, which this commit
also fixes.
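
The general shape of hashed password storage is to keep only a salted
hash of the password. A rough sketch using Authen::Passphrase::BlowfishCrypt
(whether this is the exact module and parameters used by the plugin is an
assumption here):

    # Sketch only: module choice and cost are assumptions, not
    # necessarily what the passwordauth plugin ships with.
    use Authen::Passphrase::BlowfishCrypt;

    sub hash_password {
        my $cleartext = shift;
        my $p = Authen::Passphrase::BlowfishCrypt->new(
            cost => 8,
            salt_random => 1,
            passphrase => $cleartext,
        );
        return $p->as_crypt;    # store this string, never the cleartext
    }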
|
If we have transitions of this sort in the future, this program will
hopefully be used to handle them too.