author    http://kerravonsen.dreamwidth.org/ <http://kerravonsen.dreamwidth.org/@web>    2012-01-26 21:57:21 -0400
committer admin <admin@branchable.com>    2012-01-26 21:57:21 -0400
commit    6226d1a7656954f6636932aec1f9badea582bafa (patch)
tree      75ce30637e957e05a3940cf283a7d64385412d79 /doc/todo/multi-thread_ikiwiki.mdwn
parent    0d48385bd3fe77f74d9f99f464f1f231eba9a799 (diff)
unofficial opinion
Diffstat (limited to 'doc/todo/multi-thread_ikiwiki.mdwn')
-rw-r--r--  doc/todo/multi-thread_ikiwiki.mdwn  |  4 ++++
1 file changed, 4 insertions(+), 0 deletions(-)
diff --git a/doc/todo/multi-thread_ikiwiki.mdwn b/doc/todo/multi-thread_ikiwiki.mdwn
index 1494fed7a..396037fa7 100644
--- a/doc/todo/multi-thread_ikiwiki.mdwn
+++ b/doc/todo/multi-thread_ikiwiki.mdwn
@@ -6,3 +6,7 @@ Lots of \[[!img ]] (~2200), lots of \[[!teximg ]] (~2700). A complete rebuild ta
 We could use a big machine, with plenty of CPUs. Could some multi-threading support be added to ikiwiki, by forking out all the external heavy plugins (imagemagick, tex, ...) and/or by processing pages in parallel?
 Disclaimer: I know nothing of the Perl approach to parallel processing.
+
+> I agree that it would be lovely to be able to use multiple processors to speed up rebuilds on big sites (I have a big site myself). But taking a quick look at what Perl threads entail, and taking into account what I've seen of the code of IkiWiki, making IkiWiki thread-safe would take a massive rewrite - the API would have to be completely rewritten - and then more work again to introduce threading itself. So my unofficial humble opinion is that it's unlikely to be done.
+> Which is a pity, and I hope I'm mistaken about it.
+> --[[KathrynAndersen]]
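The fork-based route floated in the original question (forking out heavy work rather than using Perl ithreads) could be sketched roughly as follows. This is a minimal illustration only: `render_page()` is a hypothetical stand-in for ikiwiki's real per-page rendering, not an actual IkiWiki function.

```perl
#!/usr/bin/perl
# Sketch of fork-based parallelism as an alternative to Perl ithreads:
# each child renders one page and exits, so no shared state has to be
# made thread-safe. render_page() is a hypothetical placeholder, not
# part of the real IkiWiki API.
use strict;
use warnings;

sub render_page {
    my ($page) = @_;
    return length $page;    # stand-in for expensive rendering work
}

my @pages = qw(index todo/multi-thread_ikiwiki sandbox);
my @pids;
for my $page (@pages) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        render_page($page); # child does the heavy lifting...
        exit 0;             # ...and exits, never touching the parent
    }
    push @pids, $pid;
}
waitpid $_, 0 for @pids;    # parent joins all children
print "rendered ", scalar(@pages), " pages\n";   # prints "rendered 3 pages"
```

One catch this sketch glosses over: anything a child computes is lost when it exits, so the parent would need pipes, temp files, or a module such as Parallel::ForkManager to collect results - which hints at why retrofitting this onto IkiWiki's shared page state is the hard part the reply above describes.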