author     Jon Dowland <jon@alcopop.org>  2008-11-10 23:34:09 +0000
committer  Jon Dowland <jon@alcopop.org>  2008-11-10 23:34:09 +0000
commit     f28069a05c063c8583b207333a14ed353874d89c (patch)
tree       e58a721e93d479ed2a0ca7d59b1ef3476adfb734
parent     c1fa07ad4f165b42c962ba2a310681107f38c4f7 (diff)
download   ikiwiki-f28069a05c063c8583b207333a14ed353874d89c.tar
           ikiwiki-f28069a05c063c8583b207333a14ed353874d89c.tar.gz
add discussion on this tip
-rw-r--r--  doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn  14
1 file changed, 14 insertions, 0 deletions
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
new file mode 100644
index 000000000..6e5f1668a
--- /dev/null
+++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
@@ -0,0 +1,14 @@
+The u32 page is excellent, but I wonder if documenting the procedure here
+would be worthwhile. Who knows, the remote site might disappear. But also
+there are some variations on the approach that might be useful:
+
+ * using a python script and the dom library to extract the page names from
+   Special:Allpages (such as
+   <http://www.staff.ncl.ac.uk/jon.dowland/unix/docs/get_pagenames.py>)
+ * Or, querying the mysql back-end to get the names
+ * using WWW::MediaWiki for importing/exporting pages from the wiki, instead
+   of Special::Export
+
+Also, some detail on converting mediawiki transclusion to ikiwiki inlines...
+
+-- [[JonDowland]]
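The first variation in the discussion above (a python script that pulls page names out of Special:Allpages) could be sketched roughly as follows. The linked get_pagenames.py script is not reproduced here, so this is a hypothetical stand-in using only the standard library; the HTML sample and the `/wiki/` link prefix are assumptions about typical MediaWiki output, not guaranteed to match any particular wiki's configuration.

```python
# Hypothetical sketch: collect page names from a saved copy of a
# MediaWiki Special:Allpages listing. Assumes article links use the
# common /wiki/PageName URL layout and carry a title attribute.
from html.parser import HTMLParser


class AllPagesParser(HTMLParser):
    """Collects the title attribute of every article link encountered."""

    def __init__(self):
        super().__init__()
        self.pages = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href, title = d.get("href", ""), d.get("title")
        # Keep only links that look like article links.
        if title and href.startswith("/wiki/"):
            self.pages.append(title)


# Tiny stand-in for a fetched Special:Allpages page.
sample = (
    '<ul><li><a href="/wiki/Foo" title="Foo">Foo</a></li>'
    '<li><a href="/wiki/Bar_baz" title="Bar baz">Bar baz</a></li></ul>'
)

parser = AllPagesParser()
parser.feed(sample)
print(parser.pages)  # → ['Foo', 'Bar baz']
```

In practice the listing would be fetched over HTTP (and Special:Allpages paginates large wikis, so the "next page" link would need following); the second bullet's mysql approach sidesteps that by reading page titles straight from the database instead.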