author    Joey Hess <joey@gnu.kitenet.net>  2009-05-05 23:40:09 -0400
committer Joey Hess <joey@gnu.kitenet.net>  2009-05-06 00:27:24 -0400
commit 2a7721febd6cac1af5e7f4b4949ffe066c62c837 (patch)
tree   c0e488da71e36ce1842e2553e2cf683e49d15676 /doc/ikiwiki-transition.mdwn
parent 1c7c9e95f227a3ff7906c000ec15bb163edc463f (diff)
Avoid %links accumulating duplicates. (For TOVA)
This is partly an optimisation and partly a bug fix. In one test case I have available, it can speed a page build up from 3 minutes to 3 seconds.

The root of the problem is that $links{$page} contains arrays of links, rather than hashes of links, and when a link is found, it is simply pushed onto the array without checking for duplicates. Now, the array is emptied before scanning a page, so there should not be much opportunity for duplicate links to pile up in it. But in some cases they can, and if there are hundreds of duplicate links in the array, then scanning it for matching links, as match_link and some other code does, becomes much more expensive than it needs to be.

Perhaps the real right fix would be to change the data structure to a hash. But the list of links is never accessed like that; you always want to iterate through it. I also looked at deduping the list in saveindex, but that does a lot of unnecessary work and doesn't completely solve the problem. So, finally, I decided to add an add_link function that handles deduping, and to make ikiwiki-transition remove the old duplicate links.
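The dedup-on-add approach the commit message describes can be sketched as follows. This is an illustrative sketch in Python, not ikiwiki's actual code (ikiwiki's real add_link is Perl); the names `links`, `add_link`, and `scan_page` mirror the identifiers mentioned above:

```python
# Sketch: keep %links-style data as page -> list of links, but make the
# "add a link" step skip duplicates so the list never accumulates them.
links = {}

def add_link(page, link):
    """Record that `page` links to `link`, skipping duplicates.

    The links stay in a list (callers iterate over it, preserving order),
    but a membership check keeps duplicates from piling up, so later
    scans of the list stay as cheap as possible.
    """
    page_links = links.setdefault(page, [])
    if link not in page_links:
        page_links.append(link)

def scan_page(page, found_links):
    """Re-scan a page: empty its link list, then add each found link."""
    links[page] = []
    for link in found_links:
        add_link(page, link)
```

As the commit message notes, a hash (set) would make the membership check O(1), but since the link list is only ever iterated, deduplicating at insertion time keeps the existing list-based data structure while preventing the pathological case of hundreds of duplicates slowing down match_link-style scans.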
Diffstat (limited to 'doc/ikiwiki-transition.mdwn')
-rw-r--r--  doc/ikiwiki-transition.mdwn | 7 +++++++
1 file changed, 7 insertions(+), 0 deletions(-)
diff --git a/doc/ikiwiki-transition.mdwn b/doc/ikiwiki-transition.mdwn
index 18836d5f5..e0b853ecf 100644
--- a/doc/ikiwiki-transition.mdwn
+++ b/doc/ikiwiki-transition.mdwn
@@ -61,6 +61,13 @@
 If this is not done explicitly, a user's plaintext password will be
 automatically converted to a hash when a user logs in for the first time
 after upgrade to ikiwiki 2.48.
 
+# deduplinks srcdir
+
+In the past, bugs in ikiwiki have allowed duplicate link information
+to be stored in its indexdb. This mode removes such duplicate information,
+which may speed up wikis afflicted by it. Note that rebuilding the wiki
+will have the same effect.
+
 # AUTHOR
 
 Josh Triplett <josh@freedesktop.org>, Joey Hess <joey@ikiwiki.info>
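For reference, the new mode documented in the diff above would be invoked against a wiki's source directory like this (`~/wiki` is an example path, not a required location):

```shell
# Remove duplicate link information from the indexdb of the wiki
# whose source directory is ~/wiki (example path). Rebuilding the
# wiki with `ikiwiki --rebuild` would have the same effect.
ikiwiki-transition deduplinks ~/wiki
```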