author    http://schmonz.livejournal.com/ <http://schmonz.livejournal.com/@web>  2009-08-02 10:29:15 -0400
committer Joey Hess <joey@kitenet.net>  2009-08-02 10:29:15 -0400
commit    4c21c5d1fa16972cf08583d96a05af87758e0936 (patch)
tree      74a4497e30e6f144a7e6f7a4cc21a1249bb7f0ba /doc
parent    81ad4377e90961a46d97248844f5fa7f26be3f24 (diff)
download  ikiwiki-4c21c5d1fa16972cf08583d96a05af87758e0936.tar
          ikiwiki-4c21c5d1fa16972cf08583d96a05af87758e0936.tar.gz
ugh, this is not a Markdown page
Diffstat (limited to 'doc')
-rw-r--r--  doc/forum/ikiwiki_over_database__63__.wiki  24
1 file changed, 5 insertions, 19 deletions
diff --git a/doc/forum/ikiwiki_over_database__63__.wiki b/doc/forum/ikiwiki_over_database__63__.wiki
index fb4d41763..b6e7266e3 100644
--- a/doc/forum/ikiwiki_over_database__63__.wiki
+++ b/doc/forum/ikiwiki_over_database__63__.wiki
@@ -1,21 +1,7 @@
Is there here any possibility to modifying ikiwiki (via plugin) for store pages in database. I'm thinking about storing pages in sqlite or mysql for serving it much faster. The idea is from sputnik.org [http://sputnik.freewisdom.org/] but with perl ;-). Could we integrate the sputnik code in ikiwiki as a solution?
-> ikiwiki generates static pages in a filesystem. It's responsible
-> for editing and regenerating them, but they're served by any old
-> web server. If you go to the trouble of stuffing the generated pages
-> into a database, you'll need to go to further trouble to serve them
-> back out somehow: write your own web server, perhaps, or a module
-> for a particular web server. Either way you'll have sacrificed
-> ikiwiki's interoperability, and it's not at all clear (since you're
-> adding, in the best case, one layer of indirection reading the
-> generated files) you'll have gained any improved page-serving
-> performance. If it's source pages you want to store in a database,
-> then you lose the ability to do random Unixy things to source pages,
-> including managing them in a revision control system.
->
-> Static HTML pages in a filesystem and the ability to do random
-> Unixy things are two of the uniquely awesome features of ikiwiki.
-> It's probably possible to do what you want, but it's unlikely that
-> you really want it. I'd suggest you either get to know ikiwiki better,
-> or choose one of the many wiki implementations that already works
-> as you describe. --[[Schmonz]]
+-----
+
+ikiwiki generates static pages in a filesystem. It's responsible for editing and regenerating them, but they're served by any old web server. If you go to the trouble of stuffing the generated pages into a database, you'll need to go to further trouble to serve them back out somehow: write your own web server, perhaps, or a module for a particular web server. Either way you'll have sacrificed ikiwiki's interoperability, and it's not at all clear (since you're adding, in the best case, one layer of indirection reading the generated files) you'll have gained any improved page-serving performance. If it's source pages you want to store in a database, then you lose the ability to do random Unixy things to source pages, including managing them in a revision control system.
+
+Static HTML pages in a filesystem and the ability to do random Unixy things are two of the uniquely awesome features of ikiwiki. It's probably possible to do what you want, but it's unlikely that you really want it. I'd suggest you either get to know ikiwiki better, or choose one of the many wiki implementations that already works as you describe. --[[Schmonz]]